Face Generation

In this project, you'll use generative adversarial networks to generate new images of faces.

Get the Data

You'll be using two datasets in this project:

  • MNIST
  • CelebA

Since the CelebA dataset is complex and this is your first GAN project, we want you to test your neural network on MNIST before CelebA. Running the GAN on MNIST will let you see how well your model trains much sooner.

If you're using FloydHub, set data_dir to "/input" and use the FloydHub data ID "R5KrjnANiKVhLWAkpXhNBe".

In [90]:
data_dir = '../data'

# FloydHub - Use with data ID "R5KrjnANiKVhLWAkpXhNBe"
# data_dir = '/input'


"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import helper

helper.download_extract('mnist', data_dir)
helper.download_extract('celeba', data_dir)
Found mnist Data
Found celeba Data

Explore the Data

MNIST

As you're aware, the MNIST dataset contains images of handwritten digits. You can change how many examples are displayed by changing show_n_images.

In [91]:
show_n_images = 25

"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
%matplotlib inline
import os
from glob import glob
from matplotlib import pyplot

mnist_images = helper.get_batch(glob(os.path.join(data_dir, 'mnist/*.jpg'))[:show_n_images], 28, 28, 'L')
pyplot.imshow(helper.images_square_grid(mnist_images, 'L'), cmap='gray')
Out[91]:
<matplotlib.image.AxesImage at 0x7fde2b9c04e0>

CelebA

The CelebFaces Attributes Dataset (CelebA) contains over 200,000 celebrity images with annotations. Since you're going to be generating faces, you won't need the annotations. You can change how many examples are displayed by changing show_n_images.

In [92]:
show_n_images = 25

"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
mnist_images = helper.get_batch(glob(os.path.join(data_dir, 'img_align_celeba/*.jpg'))[:show_n_images], 28, 28, 'RGB')
pyplot.imshow(helper.images_square_grid(mnist_images, 'RGB'))
Out[92]:
<matplotlib.image.AxesImage at 0x7fdea209ac88>

Preprocess the Data

Since the project's main focus is building the GAN, we'll preprocess the data for you. The pixel values of the MNIST and CelebA datasets will be scaled to the range -0.5 to 0.5 for 28x28 images. The CelebA images will be cropped to remove the parts of the image that don't include a face, then resized down to 28x28.

The MNIST images are black-and-white images with a single color channel, while the CelebA images have 3 color channels (RGB).
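
The helper module handles this preprocessing for you. As a rough illustration of the scaling step only, here is a minimal sketch assuming 8-bit images already loaded as NumPy arrays (the actual helper.get_batch implementation may differ):

import numpy as np

def scale_to_half_range(images):
    # Map 8-bit pixel values (0-255) into the range -0.5 to 0.5
    return images.astype(np.float32) / 255.0 - 0.5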

Build the Neural Network

You'll build the components necessary for a GAN by implementing the following functions below:

  • model_inputs
  • discriminator
  • generator
  • model_loss
  • model_opt
  • train

Check the Version of TensorFlow and Access to GPU

This will check to make sure you have the correct version of TensorFlow and access to a GPU.

In [93]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
from distutils.version import LooseVersion
import warnings
import tensorflow as tf

# Check TensorFlow Version
assert LooseVersion(tf.__version__) >= LooseVersion('1.0'), 'Please use TensorFlow version 1.0 or newer.  You are using {}'.format(tf.__version__)
print('TensorFlow Version: {}'.format(tf.__version__))

# Check for a GPU
if not tf.test.gpu_device_name():
    warnings.warn('No GPU found. Please use a GPU to train your neural network.')
else:
    print('Default GPU Device: {}'.format(tf.test.gpu_device_name()))
TensorFlow Version: 1.0.0
Default GPU Device: /gpu:0

Input

Implement the model_inputs function to create TF Placeholders for the Neural Network. It should create the following placeholders:

  • Real input images placeholder with rank 4 using image_width, image_height, and image_channels.
  • Z input placeholder with rank 2 using z_dim.
  • Learning rate placeholder with rank 0.

Return the placeholders in the following tuple: (tensor of real input images, tensor of z data, learning rate).

In [94]:
import problem_unittests as tests

def model_inputs(image_width, image_height, image_channels, z_dim):
    """
    Create the model inputs
    :param image_width: The input image width
    :param image_height: The input image height
    :param image_channels: The number of image channels
    :param z_dim: The dimension of Z
    :return: Tuple of (tensor of real input images, tensor of z data, learning rate)
    """
    # TODO: Implement Function
    input_real = tf.placeholder(tf.float32, [None, image_width, image_height, image_channels], name='input_real')
    input_z = tf.placeholder(tf.float32, [None, z_dim], name='input_z')
    learning_rate = tf.placeholder(tf.float32, name='learning_rate')

    return input_real, input_z, learning_rate


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_model_inputs(model_inputs)
Tests Passed

Discriminator

Implement discriminator to create a discriminator neural network that discriminates on images. This function should be able to reuse the variables in the neural network. Use tf.variable_scope with a scope name of "discriminator" to allow the variables to be reused. The function should return a tuple of (tensor output of the discriminator, tensor logits of the discriminator).

In [95]:
def leak_relu(x, alpha=0.1, name='leaky_relu'):
    # Leaky ReLU: pass positive values through, scale negative values by alpha
    return tf.maximum(x, alpha * x, name=name)
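
For reference, leak_relu computes max(x, alpha * x): with the default alpha = 0.1, an input of -2.0 gives max(-2.0, -0.2) = -0.2, while positive inputs pass through unchanged. The small negative slope keeps some gradient flowing through units that a plain ReLU would switch off.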
In [112]:
def discriminator(images, reuse=False, keep_prob=0.9):
    """
    Create the discriminator network
    :param images: Tensor of input image(s)
    :param reuse: Boolean if the weights should be reused
    :param keep_prob: Dropout keep probability
    :return: Tuple of (tensor output of the discriminator, tensor logits of the discriminator)
    """
    # TODO: Implement Function
    with tf.variable_scope('discriminator', reuse=reuse):
        alpha = 0.2
        x1 = tf.layers.conv2d(images, 64, 3, strides=2, padding="SAME", kernel_initializer=tf.contrib.layers.xavier_initializer())
        x1 = tf.nn.dropout(x1, keep_prob)
        relu1 = leak_relu(x1, alpha, name='leaky_relu1')
        # 14x14x64

        x2 = tf.layers.conv2d(relu1, 128, 3, strides=1, padding="SAME", kernel_initializer=tf.contrib.layers.xavier_initializer())
        x2 = tf.nn.dropout(x2, keep_prob)
        bn2 = tf.layers.batch_normalization(x2, training=True)
        relu2 = leak_relu(bn2, alpha, name='leaky_relu2')
        # 14x14x128

        x3 = tf.layers.conv2d(relu2, 256, 3, strides=2, padding="SAME", kernel_initializer=tf.contrib.layers.xavier_initializer())
        x3 = tf.nn.dropout(x3, keep_prob)
        bn3 = tf.layers.batch_normalization(x3, training=True)
        relu3 = leak_relu(bn3, alpha, name='leaky_relu3')
        # 7 x 7 x 256
        # Flatten it
        flat = tf.reshape(relu3, [-1, 7*7*256])
        logits = tf.layers.dense(flat,1)
        out = tf.sigmoid(logits)

        return out, logits


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_discriminator(discriminator, tf)
Tests Passed

Generator

Implement generator to generate an image using z. This function should be able to reuse the variables in the neural network. Use tf.variable_scope with a scope name of "generator" to allow the variables to be reused. The function should return the generated 28 x 28 x out_channel_dim images.

In [113]:
def generator(z, out_channel_dim, is_train=True):
    """
    Create the generator network
    :param z: Input z
    :param out_channel_dim: The number of channels in the output image
    :param is_train: Boolean if generator is being used for training
    :return: The tensor output of the generator
    """
    # TODO: Implement Function
    with tf.variable_scope('generator', reuse= not is_train):
        alpha = 0.2
        x1 = tf.layers.dense(z, 7 * 7 * 512)

        # 7x7x512 > 14x14x64 > 28x28xout_channel_dim
        # reshape it to start the convolution stack
        x1 = tf.reshape(x1, [-1, 7, 7, 512])
        bn1 = tf.layers.batch_normalization(x1, training=is_train)
        relu1 = leak_relu(bn1, alpha, name='leaky_relu1')
        # 7x7x512

        x2 = tf.layers.conv2d_transpose(relu1, 64, 3, strides=2, padding='SAME')
        bn2 = tf.layers.batch_normalization(x2, training=is_train)
        relu2 = leak_relu(bn2, alpha, name='leaky_relu2')
        # 14x14x64

#         x3 = tf.layers.conv2d_transpose(relu2, 64, 1, strides=1, padding='SAME')
#         bn3 = tf.layers.batch_normalization(x3, training=is_train)
#         relu3 = leak_relu(bn3, alpha, name='leaky_relu3')
        # 14 * 14 * 64

        # output layer
        logits = tf.layers.conv2d_transpose(relu2, out_channel_dim, 3, strides=2, padding='SAME')

        out = tf.tanh(logits)

        return out


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_generator(generator, tf)
Tests Passed

Loss

Implement model_loss to build the GAN for training and calculate the loss. The function should return a tuple of (discriminator loss, generator loss). Use the following functions you implemented:

  • discriminator(images, reuse=False)
  • generator(z, out_channel_dim, is_train=True)
In [114]:
def model_loss(input_real, input_z, out_channel_dim, smooth=0.1):
    """
    Get the loss for the discriminator and generator
    :param input_real: Images from the real dataset
    :param input_z: Z input
    :param out_channel_dim: The number of channels in the output image
    :param smooth: Label smoothing amount applied to the real-image labels
    :return: A tuple of (discriminator loss, generator loss)
    """
    # TODO: Implement Function
    g_model = generator(input_z, out_channel_dim)
    
    d_model_real, d_logits_real = discriminator(input_real)
    # the fake outputs come from images the generator produces from input_z
    d_model_fake, d_logits_fake = discriminator(g_model, reuse=True)
    
    # real images are labeled 1, generated images are labeled 0
    # To keep the discriminator from becoming too strong and to help it generalise better, the real labels are smoothed from 1 down to 0.9
    
    d_loss_real = tf.reduce_mean(
        tf.nn.sigmoid_cross_entropy_with_logits(logits=d_logits_real, labels=tf.ones_like(d_model_real) * (1 - smooth)))
    d_loss_fake = tf.reduce_mean(
        tf.nn.sigmoid_cross_entropy_with_logits(logits=d_logits_fake, labels=tf.zeros_like(d_model_fake)))
    
    # push the generator to produce images the discriminator classifies as real
    g_loss = tf.reduce_mean(
        tf.nn.sigmoid_cross_entropy_with_logits(logits=d_logits_fake, labels=tf.ones_like(d_model_fake)))
    
    d_loss = d_loss_real + d_loss_fake

    return d_loss, g_loss



"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_model_loss(model_loss)
Tests Passed

Optimization

Implement model_opt to create the optimization operations for the GAN. Use tf.trainable_variables to get all the trainable variables. Filter the variables with names that are in the discriminator and generator scope names. The function should return a tuple of (discriminator training operation, generator training operation).

In [115]:
def model_opt(d_loss, g_loss, learning_rate, beta1):
    """
    Get optimization operations
    :param d_loss: Discriminator loss Tensor
    :param g_loss: Generator loss Tensor
    :param learning_rate: Learning Rate Placeholder
    :param beta1: The exponential decay rate for the 1st moment in the optimizer
    :return: A tuple of (discriminator training operation, generator training operation)
    """
    # TODO: Implement Function
    t_vars = tf.trainable_variables()
    d_vars = [var for var in t_vars if var.name.startswith('discriminator')]
    g_vars = [var for var in t_vars if var.name.startswith('generator')]

    d_train_opt = tf.train.AdamOptimizer(learning_rate, beta1=beta1).minimize(d_loss, var_list=d_vars)

    # Run the generator's batch normalization update ops before its optimization step
    update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS)
    g_updates = [opt for opt in update_ops if opt.name.startswith('generator')]
    with tf.control_dependencies(g_updates):
        g_train_opt = tf.train.AdamOptimizer(learning_rate, beta1=beta1).minimize(g_loss, var_list=g_vars)


    return d_train_opt, g_train_opt


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_model_opt(model_opt, tf)
Tests Passed

Neural Network Training

Show Output

Use this function to show the current output of the generator during training. It will help you determine how well the GAN is training.

In [116]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import numpy as np

def show_generator_output(sess, n_images, input_z, out_channel_dim, image_mode):
    """
    Show example output for the generator
    :param sess: TensorFlow session
    :param n_images: Number of Images to display
    :param input_z: Input Z Tensor
    :param out_channel_dim: The number of channels in the output image
    :param image_mode: The mode to use for images ("RGB" or "L")
    """
    cmap = None if image_mode == 'RGB' else 'gray'
    z_dim = input_z.get_shape().as_list()[-1]
    example_z = np.random.uniform(-1, 1, size=[n_images, z_dim])

    samples = sess.run(
        generator(input_z, out_channel_dim, False),
        feed_dict={input_z: example_z})

    images_grid = helper.images_square_grid(samples, image_mode)
    pyplot.imshow(images_grid, cmap=cmap)
    pyplot.show()

Train

Implement train to build and train the GAN. Use the following functions you implemented:

  • model_inputs(image_width, image_height, image_channels, z_dim)
  • model_loss(input_real, input_z, out_channel_dim)
  • model_opt(d_loss, g_loss, learning_rate, beta1)

Use show_generator_output to display the generator's output while you train. Running show_generator_output for every batch will drastically increase training time and the size of the notebook. It's recommended to print the generator output every 100 batches.

In [117]:
import pickle as pkl
def train(epoch_count, batch_size, z_dim, learning_rate, beta1, get_batches, data_shape, data_image_mode):
    """
    Train the GAN
    :param epoch_count: Number of epochs
    :param batch_size: Batch Size
    :param z_dim: Z dimension
    :param learning_rate: Learning Rate
    :param beta1: The exponential decay rate for the 1st moment in the optimizer
    :param get_batches: Function to get batches
    :param data_shape: Shape of the data
    :param data_image_mode: The image mode to use for images ("RGB" or "L")
    """
    
    
    steps = 0
    samples, losses = [], []
    sample_z = np.random.uniform(-1, 1, size=(batch_size, z_dim))
    
    # TODO: Build Model
    input_real, input_z, lr = model_inputs(*data_shape[1:], z_dim)
    d_loss, g_loss = model_loss(input_real, input_z, 1 if data_image_mode=='L' else 3)
    d_train_opt, g_train_opt = model_opt(d_loss, g_loss, learning_rate, beta1)
    saver = tf.train.Saver()
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for epoch_i in range(epoch_count):
            for batch_images in get_batches(batch_size):
                # TODO: Train Model
                steps += 1
                # Sample random noise for G
                batch_z = np.random.uniform(-1, 1, size=[batch_size, z_dim])
                # Rescale real images from [-0.5, 0.5] to [-1, 1] to match the generator's tanh output
                _ = sess.run(d_train_opt, feed_dict={input_real: batch_images * (2.0/(np.max(batch_images) - np.min(batch_images))), input_z: batch_z})
                _ = sess.run(g_train_opt, feed_dict={input_z:batch_z})
                if steps % 20 == 0:
                    # Every 20 steps, compute the losses and print them out
                    train_loss_d = d_loss.eval({input_real: batch_images, input_z: batch_z})
                    train_loss_g = g_loss.eval({input_z: batch_z, input_real: batch_images})
                    print("Epoch {}/{}...".format(epoch_i+1, epoch_count),
                          "Discriminator Loss: {:.4f}...".format(train_loss_d),
                          "Generator Loss: {:.4f}".format(train_loss_g))
#                     losses.append((train_loss_d, train_loss_g))
#                     show_generator_output(sess, 5, input_z, 1 if data_image_mode=='L' else 3, data_image_mode)
                    
                if steps % 60 == 0:
#                     show_generator_output(sess, 5, input_z, 1 if data_image_mode=='L' else 3, data_image_mode)
#                     gen_samples = sess.run(
#                                    generator(input_z, 1 if data_image_mode=='L' else 3, is_train=False),
#                                    feed_dict={input_z: sample_z})
#                     samples.append(gen_samples)
                    show_generator_output(sess, 16, input_z, 1 if data_image_mode=='L' else 3, data_image_mode)
#                     _ = view_samples(-1, samples, 5, 10, figsize=figsize)
#                     plt.show()

#         saver.save(sess, './output/xavier_disc_64_128_256_gen_512_64_alpha_2_beta_5/face_generator.ckpt')

#         with open('./output/xavier_disc_64_128_256_gen_512_64_alpha_2_beta_5/face_samples.pkl', 'wb') as f:
#             pkl.dump(samples, f)
    print("Training done!")
    return losses, samples
                
                

MNIST

Test your GAN architecture on MNIST. After 2 epochs, the GAN should be able to generate images that look like handwritten digits. Make sure the loss of the generator is lower than the loss of the discriminator or close to 0.

In [58]:
%%time
batch_size = 64
z_dim = 100
learning_rate = 0.0005
beta1 = 0.5

tf.reset_default_graph()

"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
epochs = 2

mnist_dataset = helper.Dataset('mnist', glob(os.path.join(data_dir, 'mnist/*.jpg')))
with tf.Graph().as_default():
    train(epochs, batch_size, z_dim, learning_rate, beta1, mnist_dataset.get_batches,
          mnist_dataset.shape, mnist_dataset.image_mode)
Epoch 1/2... Discriminator Loss: 2.9967... Generator Loss: 0.0873
Epoch 1/2... Discriminator Loss: 2.4263... Generator Loss: 0.1586
Epoch 1/2... Discriminator Loss: 2.0070... Generator Loss: 0.2972
Epoch 1/2... Discriminator Loss: 1.8545... Generator Loss: 0.4217
Epoch 1/2... Discriminator Loss: 1.9880... Generator Loss: 0.3479
Epoch 1/2... Discriminator Loss: 1.9710... Generator Loss: 0.4089
Epoch 1/2... Discriminator Loss: 2.1204... Generator Loss: 0.3697
Epoch 1/2... Discriminator Loss: 2.1445... Generator Loss: 0.4629
Epoch 1/2... Discriminator Loss: 2.0478... Generator Loss: 0.4819
Epoch 1/2... Discriminator Loss: 2.1707... Generator Loss: 0.4759
Epoch 1/2... Discriminator Loss: 2.1812... Generator Loss: 0.5447
Epoch 1/2... Discriminator Loss: 2.1565... Generator Loss: 0.5156
Epoch 1/2... Discriminator Loss: 2.1272... Generator Loss: 0.5067
Epoch 1/2... Discriminator Loss: 2.0909... Generator Loss: 0.4314
Epoch 1/2... Discriminator Loss: 2.1603... Generator Loss: 0.4585
Epoch 1/2... Discriminator Loss: 2.0684... Generator Loss: 0.4761
Epoch 1/2... Discriminator Loss: 2.1799... Generator Loss: 0.4412
Epoch 1/2... Discriminator Loss: 2.1355... Generator Loss: 0.4236
Epoch 1/2... Discriminator Loss: 2.0496... Generator Loss: 0.6362
Epoch 1/2... Discriminator Loss: 2.2440... Generator Loss: 0.3661
Epoch 1/2... Discriminator Loss: 2.3142... Generator Loss: 0.3979
Epoch 1/2... Discriminator Loss: 2.3094... Generator Loss: 0.3921
Epoch 1/2... Discriminator Loss: 2.3005... Generator Loss: 0.3950
Epoch 1/2... Discriminator Loss: 2.2501... Generator Loss: 0.6813
Epoch 1/2... Discriminator Loss: 2.2868... Generator Loss: 0.3786
Epoch 1/2... Discriminator Loss: 2.1722... Generator Loss: 0.5455
Epoch 1/2... Discriminator Loss: 2.1941... Generator Loss: 0.6455
Epoch 1/2... Discriminator Loss: 2.3128... Generator Loss: 0.4863
Epoch 1/2... Discriminator Loss: 2.4001... Generator Loss: 0.3838
Epoch 1/2... Discriminator Loss: 2.2811... Generator Loss: 0.5061
Epoch 1/2... Discriminator Loss: 2.2441... Generator Loss: 0.4165
Epoch 1/2... Discriminator Loss: 2.3377... Generator Loss: 0.4756
Epoch 1/2... Discriminator Loss: 2.2918... Generator Loss: 0.6265
Epoch 1/2... Discriminator Loss: 2.4635... Generator Loss: 0.3306
Epoch 1/2... Discriminator Loss: 2.1827... Generator Loss: 0.6217
Epoch 1/2... Discriminator Loss: 2.4226... Generator Loss: 0.5106
Epoch 1/2... Discriminator Loss: 2.2144... Generator Loss: 0.5016
Epoch 1/2... Discriminator Loss: 2.3383... Generator Loss: 0.5337
Epoch 1/2... Discriminator Loss: 2.2978... Generator Loss: 0.3960
Epoch 1/2... Discriminator Loss: 2.3245... Generator Loss: 0.4084
Epoch 1/2... Discriminator Loss: 2.2608... Generator Loss: 0.4415
Epoch 1/2... Discriminator Loss: 2.2726... Generator Loss: 0.4660
Epoch 1/2... Discriminator Loss: 2.3054... Generator Loss: 0.5878
Epoch 1/2... Discriminator Loss: 2.3033... Generator Loss: 0.4682
Epoch 1/2... Discriminator Loss: 2.2548... Generator Loss: 0.4556
Epoch 1/2... Discriminator Loss: 2.2937... Generator Loss: 0.5389
Epoch 2/2... Discriminator Loss: 2.2755... Generator Loss: 1.2666
Epoch 2/2... Discriminator Loss: 2.2740... Generator Loss: 1.4105
Epoch 2/2... Discriminator Loss: 2.3141... Generator Loss: 1.5055
Epoch 2/2... Discriminator Loss: 2.4542... Generator Loss: 1.5349
Epoch 2/2... Discriminator Loss: 2.2904... Generator Loss: 0.7932
Epoch 2/2... Discriminator Loss: 2.3181... Generator Loss: 1.4943
Epoch 2/2... Discriminator Loss: 2.2602... Generator Loss: 0.4524
Epoch 2/2... Discriminator Loss: 2.2728... Generator Loss: 0.5101
Epoch 2/2... Discriminator Loss: 2.1950... Generator Loss: 0.8639
Epoch 2/2... Discriminator Loss: 2.2647... Generator Loss: 0.4791
Epoch 2/2... Discriminator Loss: 2.1673... Generator Loss: 0.4775
Epoch 2/2... Discriminator Loss: 2.2048... Generator Loss: 0.5294
Epoch 2/2... Discriminator Loss: 2.2323... Generator Loss: 0.4435
Epoch 2/2... Discriminator Loss: 2.2121... Generator Loss: 0.5216
Epoch 2/2... Discriminator Loss: 2.1294... Generator Loss: 1.1922
Epoch 2/2... Discriminator Loss: 2.2957... Generator Loss: 1.6923
Epoch 2/2... Discriminator Loss: 2.1804... Generator Loss: 0.7548
Epoch 2/2... Discriminator Loss: 2.1246... Generator Loss: 0.4687
Epoch 2/2... Discriminator Loss: 2.0932... Generator Loss: 0.8545
Epoch 2/2... Discriminator Loss: 2.2551... Generator Loss: 1.5844
Epoch 2/2... Discriminator Loss: 2.3404... Generator Loss: 0.8427
Epoch 2/2... Discriminator Loss: 2.3941... Generator Loss: 1.6027
Epoch 2/2... Discriminator Loss: 2.3851... Generator Loss: 1.6232
Epoch 2/2... Discriminator Loss: 2.4351... Generator Loss: 1.6505
Epoch 2/2... Discriminator Loss: 2.3558... Generator Loss: 1.2502
Epoch 2/2... Discriminator Loss: 2.3042... Generator Loss: 1.4744
Epoch 2/2... Discriminator Loss: 2.3585... Generator Loss: 0.4767
Epoch 2/2... Discriminator Loss: 2.4348... Generator Loss: 0.3673
Epoch 2/2... Discriminator Loss: 2.2991... Generator Loss: 0.4704
Epoch 2/2... Discriminator Loss: 2.3180... Generator Loss: 0.4047
Epoch 2/2... Discriminator Loss: 2.2377... Generator Loss: 1.3972
Epoch 2/2... Discriminator Loss: 2.5047... Generator Loss: 0.3729
Epoch 2/2... Discriminator Loss: 2.3647... Generator Loss: 1.0489
Epoch 2/2... Discriminator Loss: 2.3496... Generator Loss: 1.1409
Epoch 2/2... Discriminator Loss: 2.4281... Generator Loss: 1.6371
Epoch 2/2... Discriminator Loss: 2.3585... Generator Loss: 1.0952
Epoch 2/2... Discriminator Loss: 2.3229... Generator Loss: 1.0331
Epoch 2/2... Discriminator Loss: 2.3346... Generator Loss: 1.3834
Epoch 2/2... Discriminator Loss: 2.3706... Generator Loss: 1.0138
Epoch 2/2... Discriminator Loss: 2.4100... Generator Loss: 1.0939
Epoch 2/2... Discriminator Loss: 2.7143... Generator Loss: 2.0834
Epoch 2/2... Discriminator Loss: 2.4888... Generator Loss: 1.4826
Epoch 2/2... Discriminator Loss: 2.5329... Generator Loss: 1.0728
Epoch 2/2... Discriminator Loss: 2.4075... Generator Loss: 1.3941
Epoch 2/2... Discriminator Loss: 2.3017... Generator Loss: 0.7321
Epoch 2/2... Discriminator Loss: 2.4320... Generator Loss: 0.3670
Epoch 2/2... Discriminator Loss: 2.5535... Generator Loss: 1.3152
Training done!
CPU times: user 3min, sys: 50.7 s, total: 3min 51s
Wall time: 4min 55s

CelebA

Run your GAN on CelebA. It will take around 20 minutes on the average GPU to run one epoch. You can run the whole epoch or stop when it starts to generate realistic faces.

In [72]:
%%time
batch_size = 64
z_dim = 100
learning_rate = 0.0005
beta1 = 0.5


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
epochs = 1

celeba_dataset = helper.Dataset('celeba', glob(os.path.join(data_dir, 'img_align_celeba/*.jpg')))
with tf.Graph().as_default():
    train(epochs, batch_size, z_dim, learning_rate, beta1, celeba_dataset.get_batches,
          celeba_dataset.shape, celeba_dataset.image_mode)
Epoch 1/1... Discriminator Loss: 0.4501... Generator Loss: 4.4163
Epoch 1/1... Discriminator Loss: 0.6122... Generator Loss: 2.2523
Epoch 1/1... Discriminator Loss: 0.5873... Generator Loss: 7.7060
Epoch 1/1... Discriminator Loss: 2.5292... Generator Loss: 5.0456
Epoch 1/1... Discriminator Loss: 1.6196... Generator Loss: 0.5593
Epoch 1/1... Discriminator Loss: 0.5597... Generator Loss: 2.3401
Epoch 1/1... Discriminator Loss: 0.4324... Generator Loss: 4.1046
Epoch 1/1... Discriminator Loss: 0.4310... Generator Loss: 3.3067
Epoch 1/1... Discriminator Loss: 0.4128... Generator Loss: 3.4642
Epoch 1/1... Discriminator Loss: 0.7264... Generator Loss: 1.5166
Epoch 1/1... Discriminator Loss: 0.3643... Generator Loss: 7.6325
Epoch 1/1... Discriminator Loss: 0.3547... Generator Loss: 7.2232
Epoch 1/1... Discriminator Loss: 0.6644... Generator Loss: 1.5334
Epoch 1/1... Discriminator Loss: 0.3874... Generator Loss: 4.9837
Epoch 1/1... Discriminator Loss: 0.3549... Generator Loss: 7.5092
Epoch 1/1... Discriminator Loss: 0.3853... Generator Loss: 5.7130
Epoch 1/1... Discriminator Loss: 0.3634... Generator Loss: 7.1974
Epoch 1/1... Discriminator Loss: 0.3855... Generator Loss: 7.5146
Epoch 1/1... Discriminator Loss: 0.6107... Generator Loss: 1.6310
Epoch 1/1... Discriminator Loss: 0.3933... Generator Loss: 4.5819
Epoch 1/1... Discriminator Loss: 1.0715... Generator Loss: 6.0609
Epoch 1/1... Discriminator Loss: 0.3637... Generator Loss: 7.0400
Epoch 1/1... Discriminator Loss: 0.6064... Generator Loss: 2.2847
Epoch 1/1... Discriminator Loss: 0.6657... Generator Loss: 2.2409
Epoch 1/1... Discriminator Loss: 0.4372... Generator Loss: 4.9936
Epoch 1/1... Discriminator Loss: 0.4011... Generator Loss: 6.9508
Epoch 1/1... Discriminator Loss: 0.6884... Generator Loss: 1.7950
Epoch 1/1... Discriminator Loss: 0.5135... Generator Loss: 2.6521
Epoch 1/1... Discriminator Loss: 1.0609... Generator Loss: 1.2539
Epoch 1/1... Discriminator Loss: 0.4733... Generator Loss: 4.4130
Epoch 1/1... Discriminator Loss: 0.4543... Generator Loss: 4.0352
Epoch 1/1... Discriminator Loss: 0.3698... Generator Loss: 5.4377
Epoch 1/1... Discriminator Loss: 0.5217... Generator Loss: 4.1866
Epoch 1/1... Discriminator Loss: 0.8135... Generator Loss: 1.7730
Epoch 1/1... Discriminator Loss: 0.9622... Generator Loss: 1.4822
Epoch 1/1... Discriminator Loss: 0.5398... Generator Loss: 3.7573
Epoch 1/1... Discriminator Loss: 0.5040... Generator Loss: 4.1371
Epoch 1/1... Discriminator Loss: 0.6500... Generator Loss: 3.0149
Epoch 1/1... Discriminator Loss: 0.6545... Generator Loss: 3.1561
Epoch 1/1... Discriminator Loss: 0.4323... Generator Loss: 5.3612
Epoch 1/1... Discriminator Loss: 1.1597... Generator Loss: 1.2721
Epoch 1/1... Discriminator Loss: 1.5256... Generator Loss: 0.8312
Epoch 1/1... Discriminator Loss: 0.5835... Generator Loss: 3.0755
Epoch 1/1... Discriminator Loss: 0.6889... Generator Loss: 2.9523
Epoch 1/1... Discriminator Loss: 0.9856... Generator Loss: 1.5461
Epoch 1/1... Discriminator Loss: 0.7015... Generator Loss: 3.7667
Epoch 1/1... Discriminator Loss: 1.2720... Generator Loss: 1.0492
Epoch 1/1... Discriminator Loss: 0.9187... Generator Loss: 1.7693
Epoch 1/1... Discriminator Loss: 0.5686... Generator Loss: 4.6115
Epoch 1/1... Discriminator Loss: 0.5194... Generator Loss: 3.6595
Epoch 1/1... Discriminator Loss: 0.6261... Generator Loss: 3.2057
Epoch 1/1... Discriminator Loss: 0.5023... Generator Loss: 4.2533
Epoch 1/1... Discriminator Loss: 0.3634... Generator Loss: 8.8413
Epoch 1/1... Discriminator Loss: 1.4200... Generator Loss: 0.8026
Epoch 1/1... Discriminator Loss: 0.4501... Generator Loss: 4.5643
Epoch 1/1... Discriminator Loss: 0.5117... Generator Loss: 4.1056
Epoch 1/1... Discriminator Loss: 0.6302... Generator Loss: 3.0836
Epoch 1/1... Discriminator Loss: 0.6856... Generator Loss: 6.7570
Epoch 1/1... Discriminator Loss: 0.5046... Generator Loss: 3.3629
Epoch 1/1... Discriminator Loss: 0.4500... Generator Loss: 4.5872
Epoch 1/1... Discriminator Loss: 0.6018... Generator Loss: 2.9539
Epoch 1/1... Discriminator Loss: 0.7565... Generator Loss: 2.0042
Epoch 1/1... Discriminator Loss: 0.8627... Generator Loss: 1.7875
Epoch 1/1... Discriminator Loss: 0.9229... Generator Loss: 4.4564
Epoch 1/1... Discriminator Loss: 1.1480... Generator Loss: 1.0615
Epoch 1/1... Discriminator Loss: 1.2075... Generator Loss: 0.9683
Epoch 1/1... Discriminator Loss: 1.1352... Generator Loss: 0.9111
Epoch 1/1... Discriminator Loss: 0.4294... Generator Loss: 3.9313
Epoch 1/1... Discriminator Loss: 0.6194... Generator Loss: 2.5442
Epoch 1/1... Discriminator Loss: 1.0844... Generator Loss: 1.3524
Epoch 1/1... Discriminator Loss: 0.9887... Generator Loss: 1.1465
Epoch 1/1... Discriminator Loss: 0.5401... Generator Loss: 3.4337
Epoch 1/1... Discriminator Loss: 0.4432... Generator Loss: 5.1047
Epoch 1/1... Discriminator Loss: 0.5660... Generator Loss: 3.0856
Epoch 1/1... Discriminator Loss: 0.5527... Generator Loss: 3.1452
Epoch 1/1... Discriminator Loss: 3.2614... Generator Loss: 0.1156
Epoch 1/1... Discriminator Loss: 2.2158... Generator Loss: 0.3373
Epoch 1/1... Discriminator Loss: 0.5267... Generator Loss: 3.5556
Epoch 1/1... Discriminator Loss: 0.5994... Generator Loss: 3.1627
Epoch 1/1... Discriminator Loss: 0.4105... Generator Loss: 4.9116
Epoch 1/1... Discriminator Loss: 0.4262... Generator Loss: 3.2495
Epoch 1/1... Discriminator Loss: 0.4995... Generator Loss: 2.6995
Epoch 1/1... Discriminator Loss: 0.8908... Generator Loss: 1.3850
Epoch 1/1... Discriminator Loss: 0.5331... Generator Loss: 3.0565
Epoch 1/1... Discriminator Loss: 0.6100... Generator Loss: 2.3766
Epoch 1/1... Discriminator Loss: 0.4721... Generator Loss: 3.0901
Epoch 1/1... Discriminator Loss: 1.2551... Generator Loss: 1.0643
Epoch 1/1... Discriminator Loss: 1.2090... Generator Loss: 1.2035
Epoch 1/1... Discriminator Loss: 0.6707... Generator Loss: 1.9451
Epoch 1/1... Discriminator Loss: 0.6129... Generator Loss: 2.1053
Epoch 1/1... Discriminator Loss: 0.3817... Generator Loss: 4.9973
Epoch 1/1... Discriminator Loss: 0.6113... Generator Loss: 3.8385
Epoch 1/1... Discriminator Loss: 0.5628... Generator Loss: 2.8284
Epoch 1/1... Discriminator Loss: 0.6869... Generator Loss: 1.6944
Epoch 1/1... Discriminator Loss: 0.8556... Generator Loss: 1.3060
Epoch 1/1... Discriminator Loss: 0.7719... Generator Loss: 1.8315
Epoch 1/1... Discriminator Loss: 0.7810... Generator Loss: 2.0256
Epoch 1/1... Discriminator Loss: 0.6731... Generator Loss: 2.7895
Epoch 1/1... Discriminator Loss: 0.5204... Generator Loss: 3.7929
Epoch 1/1... Discriminator Loss: 1.0105... Generator Loss: 1.1707
Epoch 1/1... Discriminator Loss: 0.4464... Generator Loss: 4.3360
Epoch 1/1... Discriminator Loss: 0.4768... Generator Loss: 3.5747
Epoch 1/1... Discriminator Loss: 0.5070... Generator Loss: 4.6210
Training done!
CPU times: user 7min 16s, sys: 1min 30s, total: 8min 47s
Wall time: 11min 49s
In [80]:
%%time
batch_size = 64
z_dim = 100
learning_rate = 0.0005
beta1 = 0.5
# use batch_images * (2.0/(np.max(batch_images) - np.min(batch_images))) to normalize input images

"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
epochs = 1

celeba_dataset = helper.Dataset('celeba', glob(os.path.join(data_dir, 'img_align_celeba/*.jpg')))
with tf.Graph().as_default():
    train(epochs, batch_size, z_dim, learning_rate, beta1, celeba_dataset.get_batches,
          celeba_dataset.shape, celeba_dataset.image_mode)
Epoch 1/1... Discriminator Loss: 0.6361... Generator Loss: 6.8123
Epoch 1/1... Discriminator Loss: 0.4958... Generator Loss: 18.7370
Epoch 1/1... Discriminator Loss: 0.5469... Generator Loss: 3.6438
Epoch 1/1... Discriminator Loss: 0.6680... Generator Loss: 2.4093
Epoch 1/1... Discriminator Loss: 0.9475... Generator Loss: 1.3399
Epoch 1/1... Discriminator Loss: 0.5012... Generator Loss: 3.4603
Epoch 1/1... Discriminator Loss: 0.4221... Generator Loss: 5.5349
Epoch 1/1... Discriminator Loss: 0.3961... Generator Loss: 6.2366
Epoch 1/1... Discriminator Loss: 1.2224... Generator Loss: 0.9247
Epoch 1/1... Discriminator Loss: 0.4042... Generator Loss: 4.3636
Epoch 1/1... Discriminator Loss: 0.4024... Generator Loss: 4.4493
Epoch 1/1... Discriminator Loss: 0.3785... Generator Loss: 5.5150
Epoch 1/1... Discriminator Loss: 0.4905... Generator Loss: 2.7027
Epoch 1/1... Discriminator Loss: 1.0392... Generator Loss: 4.9964
Epoch 1/1... Discriminator Loss: 0.4120... Generator Loss: 5.3477
Epoch 1/1... Discriminator Loss: 0.6766... Generator Loss: 1.6509
Epoch 1/1... Discriminator Loss: 0.3967... Generator Loss: 4.5356
Epoch 1/1... Discriminator Loss: 0.5725... Generator Loss: 2.4688
Epoch 1/1... Discriminator Loss: 0.3910... Generator Loss: 5.3944
Epoch 1/1... Discriminator Loss: 0.5232... Generator Loss: 3.3844
Epoch 1/1... Discriminator Loss: 0.7046... Generator Loss: 2.3151
Epoch 1/1... Discriminator Loss: 0.4088... Generator Loss: 5.3200
Epoch 1/1... Discriminator Loss: 0.3894... Generator Loss: 6.4535
Epoch 1/1... Discriminator Loss: 0.3942... Generator Loss: 7.2160
Epoch 1/1... Discriminator Loss: 0.3828... Generator Loss: 4.1762
Epoch 1/1... Discriminator Loss: 0.7171... Generator Loss: 2.1249
Epoch 1/1... Discriminator Loss: 0.6272... Generator Loss: 2.1734
Epoch 1/1... Discriminator Loss: 0.4095... Generator Loss: 7.7297
Epoch 1/1... Discriminator Loss: 0.7301... Generator Loss: 2.5221
Epoch 1/1... Discriminator Loss: 0.4171... Generator Loss: 5.2149
Epoch 1/1... Discriminator Loss: 0.3956... Generator Loss: 7.0958
Epoch 1/1... Discriminator Loss: 0.3662... Generator Loss: 5.3640
Epoch 1/1... Discriminator Loss: 0.4927... Generator Loss: 3.1946
Epoch 1/1... Discriminator Loss: 0.3829... Generator Loss: 5.6020
Epoch 1/1... Discriminator Loss: 0.5713... Generator Loss: 2.0212
Epoch 1/1... Discriminator Loss: 1.6935... Generator Loss: 0.5813
Epoch 1/1... Discriminator Loss: 0.9557... Generator Loss: 3.7333
Epoch 1/1... Discriminator Loss: 0.7979... Generator Loss: 1.5296
Epoch 1/1... Discriminator Loss: 0.3782... Generator Loss: 6.7445
Epoch 1/1... Discriminator Loss: 0.4829... Generator Loss: 5.1041
Epoch 1/1... Discriminator Loss: 0.6906... Generator Loss: 1.7504
Epoch 1/1... Discriminator Loss: 0.6516... Generator Loss: 1.9184
Epoch 1/1... Discriminator Loss: 0.6533... Generator Loss: 2.3782
Epoch 1/1... Discriminator Loss: 0.6577... Generator Loss: 8.0065
Epoch 1/1... Discriminator Loss: 0.8515... Generator Loss: 1.3635
Epoch 1/1... Discriminator Loss: 0.3639... Generator Loss: 7.7529
Epoch 1/1... Discriminator Loss: 0.3751... Generator Loss: 5.8116
Epoch 1/1... Discriminator Loss: 0.8823... Generator Loss: 1.2218
Epoch 1/1... Discriminator Loss: 0.5303... Generator Loss: 3.2569
Epoch 1/1... Discriminator Loss: 0.5149... Generator Loss: 3.4662
Epoch 1/1... Discriminator Loss: 0.5496... Generator Loss: 4.2849
Epoch 1/1... Discriminator Loss: 0.3906... Generator Loss: 5.7089
Epoch 1/1... Discriminator Loss: 0.4232... Generator Loss: 3.3708
Epoch 1/1... Discriminator Loss: 0.4253... Generator Loss: 4.5690
Epoch 1/1... Discriminator Loss: 0.4289... Generator Loss: 6.7716
Epoch 1/1... Discriminator Loss: 0.6932... Generator Loss: 2.1578
Epoch 1/1... Discriminator Loss: 0.5616... Generator Loss: 2.0073
Epoch 1/1... Discriminator Loss: 1.0059... Generator Loss: 5.2616
Epoch 1/1... Discriminator Loss: 0.4274... Generator Loss: 5.1488
Epoch 1/1... Discriminator Loss: 0.4787... Generator Loss: 4.4171
Epoch 1/1... Discriminator Loss: 0.4486... Generator Loss: 3.6015
Epoch 1/1... Discriminator Loss: 0.4550... Generator Loss: 2.9730
Epoch 1/1... Discriminator Loss: 0.7070... Generator Loss: 3.0697
Epoch 1/1... Discriminator Loss: 0.6308... Generator Loss: 2.7734
Epoch 1/1... Discriminator Loss: 0.3557... Generator Loss: 8.3977
Epoch 1/1... Discriminator Loss: 0.5835... Generator Loss: 3.4590
Epoch 1/1... Discriminator Loss: 0.9819... Generator Loss: 1.1209
Epoch 1/1... Discriminator Loss: 0.5219... Generator Loss: 5.7660
Epoch 1/1... Discriminator Loss: 0.3768... Generator Loss: 4.8759
Epoch 1/1... Discriminator Loss: 0.4482... Generator Loss: 5.5700
Epoch 1/1... Discriminator Loss: 0.4864... Generator Loss: 3.7890
Epoch 1/1... Discriminator Loss: 0.7051... Generator Loss: 2.0047
Epoch 1/1... Discriminator Loss: 0.9312... Generator Loss: 1.3067
Epoch 1/1... Discriminator Loss: 0.4785... Generator Loss: 3.9307
Epoch 1/1... Discriminator Loss: 0.7790... Generator Loss: 6.9442
Epoch 1/1... Discriminator Loss: 0.5220... Generator Loss: 3.7092
Epoch 1/1... Discriminator Loss: 1.8000... Generator Loss: 0.4876
Epoch 1/1... Discriminator Loss: 0.7171... Generator Loss: 1.5943
Epoch 1/1... Discriminator Loss: 0.4677... Generator Loss: 5.7007
Epoch 1/1... Discriminator Loss: 1.1960... Generator Loss: 1.0048
Epoch 1/1... Discriminator Loss: 0.4076... Generator Loss: 5.8448
Epoch 1/1... Discriminator Loss: 0.5349... Generator Loss: 3.0245
Epoch 1/1... Discriminator Loss: 0.5200... Generator Loss: 3.0454
Epoch 1/1... Discriminator Loss: 0.8147... Generator Loss: 1.6690
Epoch 1/1... Discriminator Loss: 0.4535... Generator Loss: 5.5009
Epoch 1/1... Discriminator Loss: 0.4011... Generator Loss: 4.1121
Epoch 1/1... Discriminator Loss: 0.4418... Generator Loss: 5.5486
Epoch 1/1... Discriminator Loss: 0.5068... Generator Loss: 3.8170
Epoch 1/1... Discriminator Loss: 0.6072... Generator Loss: 3.4214
Epoch 1/1... Discriminator Loss: 0.5698... Generator Loss: 3.1129
Epoch 1/1... Discriminator Loss: 0.7129... Generator Loss: 1.6740
Epoch 1/1... Discriminator Loss: 0.9341... Generator Loss: 1.2220
Epoch 1/1... Discriminator Loss: 0.5018... Generator Loss: 5.0876
Epoch 1/1... Discriminator Loss: 0.6643... Generator Loss: 2.0720
Epoch 1/1... Discriminator Loss: 3.2119... Generator Loss: 0.1644
Epoch 1/1... Discriminator Loss: 0.4498... Generator Loss: 3.2931
Epoch 1/1... Discriminator Loss: 0.3850... Generator Loss: 4.5606
Epoch 1/1... Discriminator Loss: 0.8322... Generator Loss: 1.3224
Epoch 1/1... Discriminator Loss: 0.4582... Generator Loss: 3.5373
Epoch 1/1... Discriminator Loss: 0.5562... Generator Loss: 2.0901
Epoch 1/1... Discriminator Loss: 0.4249... Generator Loss: 2.8471
Epoch 1/1... Discriminator Loss: 0.4388... Generator Loss: 5.1100
Epoch 1/1... Discriminator Loss: 0.3912... Generator Loss: 6.7266
Training done!
CPU times: user 7min 20s, sys: 1min 32s, total: 8min 53s
Wall time: 11min 49s
In [84]:
%%time
batch_size = 64
z_dim = 100
# original 0.0005
learning_rate = 0.001
beta1 = 0.5
# use batch_images * (2.0/(np.max(batch_images) - np.min(batch_images))) to normalize input images
# generator kernel size changed to 3
"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
epochs = 1

celeba_dataset = helper.Dataset('celeba', glob(os.path.join(data_dir, 'img_align_celeba/*.jpg')))
with tf.Graph().as_default():
    train(epochs, batch_size, z_dim, learning_rate, beta1, celeba_dataset.get_batches,
          celeba_dataset.shape, celeba_dataset.image_mode)
Epoch 1/1... Discriminator Loss: 0.5990... Generator Loss: 3.8529
Epoch 1/1... Discriminator Loss: 0.8928... Generator Loss: 1.7190
Epoch 1/1... Discriminator Loss: 2.0425... Generator Loss: 0.4461
Epoch 1/1... Discriminator Loss: 0.3925... Generator Loss: 5.8557
Epoch 1/1... Discriminator Loss: 1.8557... Generator Loss: 7.5328
Epoch 1/1... Discriminator Loss: 1.6664... Generator Loss: 0.6599
Epoch 1/1... Discriminator Loss: 0.4774... Generator Loss: 2.9190
Epoch 1/1... Discriminator Loss: 0.6857... Generator Loss: 5.0907
Epoch 1/1... Discriminator Loss: 0.5486... Generator Loss: 3.0770
Epoch 1/1... Discriminator Loss: 0.6976... Generator Loss: 1.7166
Epoch 1/1... Discriminator Loss: 1.2424... Generator Loss: 0.9356
Epoch 1/1... Discriminator Loss: 2.8143... Generator Loss: 0.1942
Epoch 1/1... Discriminator Loss: 1.7391... Generator Loss: 4.6192
Epoch 1/1... Discriminator Loss: 0.3842... Generator Loss: 5.5219
Epoch 1/1... Discriminator Loss: 1.0299... Generator Loss: 2.4820
Epoch 1/1... Discriminator Loss: 2.1265... Generator Loss: 0.3562
Epoch 1/1... Discriminator Loss: 0.8741... Generator Loss: 1.4556
Epoch 1/1... Discriminator Loss: 2.1168... Generator Loss: 0.2929
Epoch 1/1... Discriminator Loss: 0.6722... Generator Loss: 2.5152
Epoch 1/1... Discriminator Loss: 0.5353... Generator Loss: 3.3736
Epoch 1/1... Discriminator Loss: 0.7100... Generator Loss: 1.6355
Epoch 1/1... Discriminator Loss: 0.9884... Generator Loss: 0.9358
Epoch 1/1... Discriminator Loss: 1.0949... Generator Loss: 0.8689
Epoch 1/1... Discriminator Loss: 0.8397... Generator Loss: 1.2704
Epoch 1/1... Discriminator Loss: 0.6013... Generator Loss: 4.3384
Epoch 1/1... Discriminator Loss: 0.5585... Generator Loss: 3.1118
Epoch 1/1... Discriminator Loss: 1.1628... Generator Loss: 0.9029
Epoch 1/1... Discriminator Loss: 0.4211... Generator Loss: 5.0778
Epoch 1/1... Discriminator Loss: 1.0109... Generator Loss: 1.2585
Epoch 1/1... Discriminator Loss: 0.3655... Generator Loss: 4.8355
Epoch 1/1... Discriminator Loss: 0.3650... Generator Loss: 5.6131
Epoch 1/1... Discriminator Loss: 0.7954... Generator Loss: 2.3791
Epoch 1/1... Discriminator Loss: 0.6282... Generator Loss: 2.0726
Epoch 1/1... Discriminator Loss: 0.6405... Generator Loss: 1.8794
Epoch 1/1... Discriminator Loss: 1.0777... Generator Loss: 0.8264
Epoch 1/1... Discriminator Loss: 0.4557... Generator Loss: 3.4935
Epoch 1/1... Discriminator Loss: 2.7818... Generator Loss: 0.1765
Epoch 1/1... Discriminator Loss: 0.4292... Generator Loss: 4.4969
Epoch 1/1... Discriminator Loss: 0.9469... Generator Loss: 1.0098
Epoch 1/1... Discriminator Loss: 0.6707... Generator Loss: 1.9594
Epoch 1/1... Discriminator Loss: 1.0967... Generator Loss: 0.7587
Epoch 1/1... Discriminator Loss: 0.3763... Generator Loss: 6.2971
Epoch 1/1... Discriminator Loss: 0.3761... Generator Loss: 4.9975
Epoch 1/1... Discriminator Loss: 0.3689... Generator Loss: 4.7907
Epoch 1/1... Discriminator Loss: 0.5799... Generator Loss: 2.4676
Epoch 1/1... Discriminator Loss: 0.6757... Generator Loss: 1.9782
Epoch 1/1... Discriminator Loss: 0.5792... Generator Loss: 4.0857
Epoch 1/1... Discriminator Loss: 0.4477... Generator Loss: 4.5921
Epoch 1/1... Discriminator Loss: 1.1251... Generator Loss: 1.0954
Epoch 1/1... Discriminator Loss: 0.6148... Generator Loss: 1.8338
Epoch 1/1... Discriminator Loss: 1.0513... Generator Loss: 0.9011
Epoch 1/1... Discriminator Loss: 0.6820... Generator Loss: 1.5359
Epoch 1/1... Discriminator Loss: 0.5255... Generator Loss: 2.0133
Epoch 1/1... Discriminator Loss: 0.4434... Generator Loss: 2.6796
Epoch 1/1... Discriminator Loss: 1.2863... Generator Loss: 0.7285
Epoch 1/1... Discriminator Loss: 0.6079... Generator Loss: 2.4793
Epoch 1/1... Discriminator Loss: 0.5554... Generator Loss: 1.9439
Epoch 1/1... Discriminator Loss: 0.6831... Generator Loss: 1.4701
Epoch 1/1... Discriminator Loss: 0.4788... Generator Loss: 2.9173
Epoch 1/1... Discriminator Loss: 0.3933... Generator Loss: 5.3740
Epoch 1/1... Discriminator Loss: 0.8107... Generator Loss: 2.4928
Epoch 1/1... Discriminator Loss: 0.6011... Generator Loss: 2.6949
Epoch 1/1... Discriminator Loss: 1.0225... Generator Loss: 1.0028
Epoch 1/1... Discriminator Loss: 1.0588... Generator Loss: 0.9075
Epoch 1/1... Discriminator Loss: 1.4318... Generator Loss: 0.5130
Epoch 1/1... Discriminator Loss: 0.5463... Generator Loss: 2.1189
Epoch 1/1... Discriminator Loss: 1.8262... Generator Loss: 0.3724
Epoch 1/1... Discriminator Loss: 0.6910... Generator Loss: 1.4826
Epoch 1/1... Discriminator Loss: 1.0605... Generator Loss: 0.9140
Epoch 1/1... Discriminator Loss: 0.3733... Generator Loss: 4.8650
Epoch 1/1... Discriminator Loss: 1.0475... Generator Loss: 0.8377
Epoch 1/1... Discriminator Loss: 0.5449... Generator Loss: 3.4212
Epoch 1/1... Discriminator Loss: 0.9678... Generator Loss: 0.9916
Epoch 1/1... Discriminator Loss: 0.8959... Generator Loss: 1.0631
Epoch 1/1... Discriminator Loss: 0.4723... Generator Loss: 2.7168
Epoch 1/1... Discriminator Loss: 0.5515... Generator Loss: 3.1926
Epoch 1/1... Discriminator Loss: 0.4837... Generator Loss: 2.9303
Epoch 1/1... Discriminator Loss: 0.8432... Generator Loss: 3.6148
Epoch 1/1... Discriminator Loss: 0.4047... Generator Loss: 5.4992
Epoch 1/1... Discriminator Loss: 0.6249... Generator Loss: 2.7519
Epoch 1/1... Discriminator Loss: 0.5698... Generator Loss: 1.7559
Epoch 1/1... Discriminator Loss: 0.4614... Generator Loss: 2.8161
Epoch 1/1... Discriminator Loss: 1.0604... Generator Loss: 1.0290
Epoch 1/1... Discriminator Loss: 1.2757... Generator Loss: 0.8015
Epoch 1/1... Discriminator Loss: 0.5902... Generator Loss: 2.5605
Epoch 1/1... Discriminator Loss: 2.8980... Generator Loss: 10.5952
Epoch 1/1... Discriminator Loss: 0.5684... Generator Loss: 7.4394
Epoch 1/1... Discriminator Loss: 1.6784... Generator Loss: 0.6104
Epoch 1/1... Discriminator Loss: 0.4676... Generator Loss: 6.5261
Epoch 1/1... Discriminator Loss: 0.5112... Generator Loss: 6.3422
Epoch 1/1... Discriminator Loss: 0.4279... Generator Loss: 6.5485
Epoch 1/1... Discriminator Loss: 0.8476... Generator Loss: 1.4958
Epoch 1/1... Discriminator Loss: 0.5165... Generator Loss: 2.1609
Epoch 1/1... Discriminator Loss: 0.7027... Generator Loss: 1.5861
Epoch 1/1... Discriminator Loss: 0.6693... Generator Loss: 1.5841
Epoch 1/1... Discriminator Loss: 1.3624... Generator Loss: 0.7003
Epoch 1/1... Discriminator Loss: 0.8400... Generator Loss: 1.1413
Epoch 1/1... Discriminator Loss: 0.9512... Generator Loss: 2.8377
Epoch 1/1... Discriminator Loss: 0.5769... Generator Loss: 2.0467
Epoch 1/1... Discriminator Loss: 1.0297... Generator Loss: 1.0226
Epoch 1/1... Discriminator Loss: 0.5297... Generator Loss: 2.9496
Epoch 1/1... Discriminator Loss: 1.1930... Generator Loss: 1.3960
Epoch 1/1... Discriminator Loss: 0.7732... Generator Loss: 1.3124
Training done!
CPU times: user 6min 52s, sys: 1min 26s, total: 8min 18s
Wall time: 10min 54s

Change beta1 and discriminator kernel size

When I reduce the kernel size of several layers in the discriminator, it gives me a nice face framework but with more noise, showing up as small blocky artifacts. When I then reduce the beta1 parameter, the faces look smoother and some of the noise disappears.
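
The change described above is just the kernel-size argument of the discriminator's convolution layers. A minimal before/after sketch of the first layer, assuming the earlier runs used 5x5 kernels (the exact previous value isn't recorded in this notebook):

import tensorflow as tf

def disc_first_layer(images, kernel_size=3):
    # kernel_size=5 would reproduce the hypothetical larger-kernel run;
    # kernel_size=3 matches the discriminator implemented above
    return tf.layers.conv2d(images, 64, kernel_size, strides=2, padding="SAME",
                            kernel_initializer=tf.contrib.layers.xavier_initializer())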

In [89]:
%%time
batch_size = 64
z_dim = 100
# original 0.0005
learning_rate = 0.001
beta1 = 0.5
# use batch_images * (2.0/(np.max(batch_images) - np.min(batch_images))) to normalize input images
# generator kernel size changed to 3
# disc first layer kernel size changed to 3
"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
epochs = 1

celeba_dataset = helper.Dataset('celeba', glob(os.path.join(data_dir, 'img_align_celeba/*.jpg')))
with tf.Graph().as_default():
    train(epochs, batch_size, z_dim, learning_rate, beta1, celeba_dataset.get_batches,
          celeba_dataset.shape, celeba_dataset.image_mode)
Epoch 1/1... Discriminator Loss: 0.6993... Generator Loss: 8.4567
Epoch 1/1... Discriminator Loss: 0.6210... Generator Loss: 3.1975
Epoch 1/1... Discriminator Loss: 0.7355... Generator Loss: 3.1621
Epoch 1/1... Discriminator Loss: 1.1517... Generator Loss: 7.7283
Epoch 1/1... Discriminator Loss: 0.9149... Generator Loss: 8.8806
Epoch 1/1... Discriminator Loss: 0.3880... Generator Loss: 3.7736
Epoch 1/1... Discriminator Loss: 0.6648... Generator Loss: 6.1504
Epoch 1/1... Discriminator Loss: 2.8179... Generator Loss: 0.1655
Epoch 1/1... Discriminator Loss: 0.4049... Generator Loss: 4.7252
Epoch 1/1... Discriminator Loss: 3.1984... Generator Loss: 0.1849
Epoch 1/1... Discriminator Loss: 0.4044... Generator Loss: 5.0243
Epoch 1/1... Discriminator Loss: 1.3595... Generator Loss: 0.7706
Epoch 1/1... Discriminator Loss: 0.6280... Generator Loss: 2.5756
Epoch 1/1... Discriminator Loss: 2.1970... Generator Loss: 0.2964
Epoch 1/1... Discriminator Loss: 0.8580... Generator Loss: 1.7531
Epoch 1/1... Discriminator Loss: 2.4885... Generator Loss: 0.2688
Epoch 1/1... Discriminator Loss: 0.5557... Generator Loss: 3.7609
Epoch 1/1... Discriminator Loss: 1.1092... Generator Loss: 1.1672
Epoch 1/1... Discriminator Loss: 0.4669... Generator Loss: 2.5949
Epoch 1/1... Discriminator Loss: 2.5602... Generator Loss: 0.2144
Epoch 1/1... Discriminator Loss: 0.7667... Generator Loss: 1.5712
Epoch 1/1... Discriminator Loss: 0.6985... Generator Loss: 1.4602
Epoch 1/1... Discriminator Loss: 0.5552... Generator Loss: 2.1630
Epoch 1/1... Discriminator Loss: 0.6220... Generator Loss: 1.7912
Epoch 1/1... Discriminator Loss: 0.5959... Generator Loss: 2.5028
Epoch 1/1... Discriminator Loss: 1.5538... Generator Loss: 0.5497
Epoch 1/1... Discriminator Loss: 0.8520... Generator Loss: 1.2269
Epoch 1/1... Discriminator Loss: 0.7321... Generator Loss: 1.6405
Epoch 1/1... Discriminator Loss: 2.5454... Generator Loss: 0.1698
Epoch 1/1... Discriminator Loss: 0.3961... Generator Loss: 5.8641
Epoch 1/1... Discriminator Loss: 0.9611... Generator Loss: 1.0843
Epoch 1/1... Discriminator Loss: 0.5320... Generator Loss: 2.5398
Epoch 1/1... Discriminator Loss: 1.3926... Generator Loss: 7.0084
Epoch 1/1... Discriminator Loss: 0.5179... Generator Loss: 2.2477
Epoch 1/1... Discriminator Loss: 0.8575... Generator Loss: 3.2102
Epoch 1/1... Discriminator Loss: 0.8956... Generator Loss: 1.1683
Epoch 1/1... Discriminator Loss: 0.5464... Generator Loss: 2.6130
Epoch 1/1... Discriminator Loss: 0.6442... Generator Loss: 1.9219
Epoch 1/1... Discriminator Loss: 0.4522... Generator Loss: 3.3397
Epoch 1/1... Discriminator Loss: 1.1756... Generator Loss: 3.4593
Epoch 1/1... Discriminator Loss: 0.6257... Generator Loss: 2.1334
Epoch 1/1... Discriminator Loss: 0.5850... Generator Loss: 2.0063
Epoch 1/1... Discriminator Loss: 0.4223... Generator Loss: 3.3663
Epoch 1/1... Discriminator Loss: 1.4481... Generator Loss: 0.5963
Epoch 1/1... Discriminator Loss: 0.4816... Generator Loss: 3.2009
Epoch 1/1... Discriminator Loss: 0.4597... Generator Loss: 3.1338
Epoch 1/1... Discriminator Loss: 0.4556... Generator Loss: 2.9484
Epoch 1/1... Discriminator Loss: 1.4602... Generator Loss: 0.7609
Epoch 1/1... Discriminator Loss: 0.7675... Generator Loss: 1.4009
Epoch 1/1... Discriminator Loss: 0.8211... Generator Loss: 1.7412
Epoch 1/1... Discriminator Loss: 0.5141... Generator Loss: 3.3174
Epoch 1/1... Discriminator Loss: 1.1638... Generator Loss: 0.9873
Epoch 1/1... Discriminator Loss: 0.7590... Generator Loss: 1.8460
Epoch 1/1... Discriminator Loss: 0.3850... Generator Loss: 6.3314
Epoch 1/1... Discriminator Loss: 0.6485... Generator Loss: 2.2195
Epoch 1/1... Discriminator Loss: 0.6673... Generator Loss: 1.7318
Epoch 1/1... Discriminator Loss: 0.5393... Generator Loss: 2.3916
Epoch 1/1... Discriminator Loss: 0.4621... Generator Loss: 4.0631
Epoch 1/1... Discriminator Loss: 0.4891... Generator Loss: 3.5460
Epoch 1/1... Discriminator Loss: 0.6634... Generator Loss: 1.8944
Epoch 1/1... Discriminator Loss: 0.5611... Generator Loss: 1.9651
Epoch 1/1... Discriminator Loss: 0.4601... Generator Loss: 3.4720
Epoch 1/1... Discriminator Loss: 2.9127... Generator Loss: 0.1577
Epoch 1/1... Discriminator Loss: 0.4337... Generator Loss: 3.0176
Epoch 1/1... Discriminator Loss: 1.0331... Generator Loss: 1.0404
Epoch 1/1... Discriminator Loss: 0.7068... Generator Loss: 2.0858
Epoch 1/1... Discriminator Loss: 1.1413... Generator Loss: 1.1075
Epoch 1/1... Discriminator Loss: 1.1985... Generator Loss: 0.8528
Epoch 1/1... Discriminator Loss: 0.6642... Generator Loss: 2.5338
Epoch 1/1... Discriminator Loss: 0.4704... Generator Loss: 3.0791
Epoch 1/1... Discriminator Loss: 0.4804... Generator Loss: 3.2710
Epoch 1/1... Discriminator Loss: 0.4376... Generator Loss: 2.8540
Epoch 1/1... Discriminator Loss: 0.3887... Generator Loss: 4.6421
Epoch 1/1... Discriminator Loss: 0.5356... Generator Loss: 2.6840
Epoch 1/1... Discriminator Loss: 0.6721... Generator Loss: 1.8867
Epoch 1/1... Discriminator Loss: 0.7105... Generator Loss: 1.9577
Epoch 1/1... Discriminator Loss: 1.2095... Generator Loss: 1.7955
Epoch 1/1... Discriminator Loss: 0.4905... Generator Loss: 2.5793
Epoch 1/1... Discriminator Loss: 1.3192... Generator Loss: 0.6881
Epoch 1/1... Discriminator Loss: 0.4799... Generator Loss: 2.6487
Epoch 1/1... Discriminator Loss: 1.5092... Generator Loss: 4.5250
Epoch 1/1... Discriminator Loss: 0.9018... Generator Loss: 2.0190
Epoch 1/1... Discriminator Loss: 0.4768... Generator Loss: 2.7269
Epoch 1/1... Discriminator Loss: 0.4123... Generator Loss: 3.2151
Epoch 1/1... Discriminator Loss: 0.4369... Generator Loss: 3.2919
Epoch 1/1... Discriminator Loss: 0.4778... Generator Loss: 3.5210
Epoch 1/1... Discriminator Loss: 0.5526... Generator Loss: 2.1172
Epoch 1/1... Discriminator Loss: 0.3765... Generator Loss: 5.7917
Epoch 1/1... Discriminator Loss: 1.1081... Generator Loss: 0.8908
Epoch 1/1... Discriminator Loss: 0.6958... Generator Loss: 2.0192
Epoch 1/1... Discriminator Loss: 0.8958... Generator Loss: 2.3483
Epoch 1/1... Discriminator Loss: 0.8360... Generator Loss: 1.6979
Epoch 1/1... Discriminator Loss: 0.4999... Generator Loss: 2.3893
Epoch 1/1... Discriminator Loss: 0.6406... Generator Loss: 2.0124
Epoch 1/1... Discriminator Loss: 1.5984... Generator Loss: 0.7462
Epoch 1/1... Discriminator Loss: 1.1494... Generator Loss: 0.8997
Epoch 1/1... Discriminator Loss: 0.8157... Generator Loss: 1.5911
Epoch 1/1... Discriminator Loss: 1.2896... Generator Loss: 0.7471
Epoch 1/1... Discriminator Loss: 0.7982... Generator Loss: 3.9114
Epoch 1/1... Discriminator Loss: 0.9528... Generator Loss: 2.1706
Epoch 1/1... Discriminator Loss: 0.7791... Generator Loss: 1.9169
Epoch 1/1... Discriminator Loss: 2.2377... Generator Loss: 4.3730
Epoch 1/1... Discriminator Loss: 0.7985... Generator Loss: 2.2687
Training done!
CPU times: user 6min 49s, sys: 1min 25s, total: 8min 15s
Wall time: 10min 50s
In [103]:
%%time
batch_size = 64
z_dim = 100
# original 0.0005
learning_rate = 0.001
beta1 = 0.4
# use batch_images * (2.0/(np.max(batch_images) - np.min(batch_images))) to normalize input images
# generator kernel size changed to 3
# disc first layer kernel size changed to 3
# disc second layer kernel size changed to 3
"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
epochs = 1

celeba_dataset = helper.Dataset('celeba', glob(os.path.join(data_dir, 'img_align_celeba/*.jpg')))
with tf.Graph().as_default():
    train(epochs, batch_size, z_dim, learning_rate, beta1, celeba_dataset.get_batches,
          celeba_dataset.shape, celeba_dataset.image_mode)
Epoch 1/1... Discriminator Loss: 0.8904... Generator Loss: 2.8024
Epoch 1/1... Discriminator Loss: 2.7180... Generator Loss: 5.0313
Epoch 1/1... Discriminator Loss: 2.1505... Generator Loss: 0.2707
Epoch 1/1... Discriminator Loss: 1.0301... Generator Loss: 1.7263
Epoch 1/1... Discriminator Loss: 0.7685... Generator Loss: 2.2790
Epoch 1/1... Discriminator Loss: 0.4518... Generator Loss: 3.8424
Epoch 1/1... Discriminator Loss: 0.4784... Generator Loss: 6.5366
Epoch 1/1... Discriminator Loss: 0.4383... Generator Loss: 3.2264
Epoch 1/1... Discriminator Loss: 0.5067... Generator Loss: 2.1172
Epoch 1/1... Discriminator Loss: 5.8299... Generator Loss: 9.9436
Epoch 1/1... Discriminator Loss: 0.8409... Generator Loss: 1.5934
Epoch 1/1... Discriminator Loss: 0.3577... Generator Loss: 6.6214
Epoch 1/1... Discriminator Loss: 0.6552... Generator Loss: 1.8315
Epoch 1/1... Discriminator Loss: 0.5750... Generator Loss: 3.3235
Epoch 1/1... Discriminator Loss: 0.4470... Generator Loss: 5.9432
Epoch 1/1... Discriminator Loss: 0.3954... Generator Loss: 4.7493
Epoch 1/1... Discriminator Loss: 1.5093... Generator Loss: 0.7575
Epoch 1/1... Discriminator Loss: 0.6061... Generator Loss: 2.1588
Epoch 1/1... Discriminator Loss: 0.5176... Generator Loss: 3.2229
Epoch 1/1... Discriminator Loss: 2.0482... Generator Loss: 0.3071
Epoch 1/1... Discriminator Loss: 0.5366... Generator Loss: 3.8122
Epoch 1/1... Discriminator Loss: 0.6289... Generator Loss: 1.9049
Epoch 1/1... Discriminator Loss: 1.5434... Generator Loss: 0.5002
Epoch 1/1... Discriminator Loss: 0.6820... Generator Loss: 3.6354
Epoch 1/1... Discriminator Loss: 0.6985... Generator Loss: 1.4666
Epoch 1/1... Discriminator Loss: 1.1686... Generator Loss: 0.7262
Epoch 1/1... Discriminator Loss: 1.4825... Generator Loss: 0.5531
Epoch 1/1... Discriminator Loss: 0.6679... Generator Loss: 2.0838
Epoch 1/1... Discriminator Loss: 2.9070... Generator Loss: 0.1235
Epoch 1/1... Discriminator Loss: 1.5050... Generator Loss: 4.5687
Epoch 1/1... Discriminator Loss: 0.3933... Generator Loss: 4.7158
Epoch 1/1... Discriminator Loss: 0.3900... Generator Loss: 6.8282
Epoch 1/1... Discriminator Loss: 0.4725... Generator Loss: 3.2029
Epoch 1/1... Discriminator Loss: 1.6450... Generator Loss: 0.4739
Epoch 1/1... Discriminator Loss: 0.8696... Generator Loss: 1.5319
Epoch 1/1... Discriminator Loss: 0.6861... Generator Loss: 5.1660
Epoch 1/1... Discriminator Loss: 0.4692... Generator Loss: 2.9719
Epoch 1/1... Discriminator Loss: 1.2579... Generator Loss: 0.8050
Epoch 1/1... Discriminator Loss: 0.6851... Generator Loss: 1.6854
Epoch 1/1... Discriminator Loss: 1.1432... Generator Loss: 0.8720
Epoch 1/1... Discriminator Loss: 0.8412... Generator Loss: 1.1876
Epoch 1/1... Discriminator Loss: 0.5383... Generator Loss: 2.5555
Epoch 1/1... Discriminator Loss: 0.4229... Generator Loss: 3.0878
Epoch 1/1... Discriminator Loss: 0.6210... Generator Loss: 2.0092
Epoch 1/1... Discriminator Loss: 0.6158... Generator Loss: 2.3593
Epoch 1/1... Discriminator Loss: 0.5367... Generator Loss: 2.9469
Epoch 1/1... Discriminator Loss: 0.8840... Generator Loss: 1.2175
Epoch 1/1... Discriminator Loss: 0.6549... Generator Loss: 2.7551
Epoch 1/1... Discriminator Loss: 2.0132... Generator Loss: 0.2913
Epoch 1/1... Discriminator Loss: 0.7320... Generator Loss: 1.5639
Epoch 1/1... Discriminator Loss: 0.4387... Generator Loss: 2.8974
Epoch 1/1... Discriminator Loss: 0.6090... Generator Loss: 1.9878
Epoch 1/1... Discriminator Loss: 0.9193... Generator Loss: 1.0325
Epoch 1/1... Discriminator Loss: 0.5772... Generator Loss: 1.9403
Epoch 1/1... Discriminator Loss: 2.5882... Generator Loss: 0.2216
Epoch 1/1... Discriminator Loss: 0.4773... Generator Loss: 3.6897
Epoch 1/1... Discriminator Loss: 0.9678... Generator Loss: 0.9727
Epoch 1/1... Discriminator Loss: 0.4508... Generator Loss: 3.5039
Epoch 1/1... Discriminator Loss: 0.7989... Generator Loss: 1.3945
Epoch 1/1... Discriminator Loss: 0.4570... Generator Loss: 2.8208
Epoch 1/1... Discriminator Loss: 0.4422... Generator Loss: 3.7952
Epoch 1/1... Discriminator Loss: 0.4805... Generator Loss: 2.9003
Epoch 1/1... Discriminator Loss: 0.9478... Generator Loss: 1.1932
Epoch 1/1... Discriminator Loss: 0.4775... Generator Loss: 3.6193
Epoch 1/1... Discriminator Loss: 0.7551... Generator Loss: 1.4507
Epoch 1/1... Discriminator Loss: 0.9701... Generator Loss: 1.0194
Epoch 1/1... Discriminator Loss: 0.9965... Generator Loss: 0.9028
Epoch 1/1... Discriminator Loss: 1.0476... Generator Loss: 2.1155
Epoch 1/1... Discriminator Loss: 0.5647... Generator Loss: 2.4035
Epoch 1/1... Discriminator Loss: 0.5671... Generator Loss: 2.3898
Epoch 1/1... Discriminator Loss: 0.4744... Generator Loss: 3.6577
Epoch 1/1... Discriminator Loss: 1.2085... Generator Loss: 0.8000
Epoch 1/1... Discriminator Loss: 0.5495... Generator Loss: 2.5931
Epoch 1/1... Discriminator Loss: 0.6071... Generator Loss: 1.9049
Epoch 1/1... Discriminator Loss: 0.4801... Generator Loss: 3.1753
Epoch 1/1... Discriminator Loss: 1.9662... Generator Loss: 3.2876
Epoch 1/1... Discriminator Loss: 0.6050... Generator Loss: 1.9674
Epoch 1/1... Discriminator Loss: 0.6934... Generator Loss: 1.7443
Epoch 1/1... Discriminator Loss: 1.3672... Generator Loss: 0.6257
Epoch 1/1... Discriminator Loss: 0.7060... Generator Loss: 1.6486
Epoch 1/1... Discriminator Loss: 1.3853... Generator Loss: 0.8995
Epoch 1/1... Discriminator Loss: 1.3530... Generator Loss: 1.8618
Epoch 1/1... Discriminator Loss: 1.4333... Generator Loss: 0.6038
Epoch 1/1... Discriminator Loss: 0.9789... Generator Loss: 0.9225
Epoch 1/1... Discriminator Loss: 0.7244... Generator Loss: 1.4876
Epoch 1/1... Discriminator Loss: 1.1032... Generator Loss: 0.7824
Epoch 1/1... Discriminator Loss: 0.6946... Generator Loss: 1.7973
Epoch 1/1... Discriminator Loss: 1.4935... Generator Loss: 0.5299
Epoch 1/1... Discriminator Loss: 1.1650... Generator Loss: 3.0671
Epoch 1/1... Discriminator Loss: 0.9005... Generator Loss: 3.5948
Epoch 1/1... Discriminator Loss: 0.7485... Generator Loss: 1.7751
Epoch 1/1... Discriminator Loss: 0.8151... Generator Loss: 2.0498
Epoch 1/1... Discriminator Loss: 0.8304... Generator Loss: 1.4156
Epoch 1/1... Discriminator Loss: 0.7429... Generator Loss: 1.4593
Epoch 1/1... Discriminator Loss: 1.1365... Generator Loss: 2.4038
Epoch 1/1... Discriminator Loss: 2.4868... Generator Loss: 0.1946
Epoch 1/1... Discriminator Loss: 1.1497... Generator Loss: 0.8764
Epoch 1/1... Discriminator Loss: 0.5916... Generator Loss: 2.8317
Epoch 1/1... Discriminator Loss: 0.9483... Generator Loss: 1.5832
Epoch 1/1... Discriminator Loss: 1.5456... Generator Loss: 3.5551
Epoch 1/1... Discriminator Loss: 1.1268... Generator Loss: 0.8910
Epoch 1/1... Discriminator Loss: 0.9834... Generator Loss: 1.0340
Epoch 1/1... Discriminator Loss: 0.9218... Generator Loss: 2.8740
Training done!
CPU times: user 6min 15s, sys: 1min 12s, total: 7min 28s
Wall time: 9min 28s
In [104]:
%%time
batch_size = 64
z_dim = 100
# original learning rate was 0.0005
learning_rate = 0.001
beta1 = 0.3
# normalize input images with batch_images * (2.0 / (np.max(batch_images) - np.min(batch_images)))
# generator kernel size changed to 3
# discriminator first layer kernel size changed to 3
# discriminator second layer kernel size changed to 3
"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
epochs = 1

celeba_dataset = helper.Dataset('celeba', glob(os.path.join(data_dir, 'img_align_celeba/*.jpg')))
with tf.Graph().as_default():
    train(epochs, batch_size, z_dim, learning_rate, beta1, celeba_dataset.get_batches,
          celeba_dataset.shape, celeba_dataset.image_mode)
Epoch 1/1... Discriminator Loss: 2.5411... Generator Loss: 0.5604
Epoch 1/1... Discriminator Loss: 0.7728... Generator Loss: 5.5849
Epoch 1/1... Discriminator Loss: 1.2916... Generator Loss: 9.0802
Epoch 1/1... Discriminator Loss: 0.7443... Generator Loss: 2.8723
Epoch 1/1... Discriminator Loss: 0.3892... Generator Loss: 5.0928
Epoch 1/1... Discriminator Loss: 0.5545... Generator Loss: 2.6336
Epoch 1/1... Discriminator Loss: 0.6927... Generator Loss: 1.6978
Epoch 1/1... Discriminator Loss: 1.0358... Generator Loss: 1.2420
Epoch 1/1... Discriminator Loss: 0.4157... Generator Loss: 7.1250
Epoch 1/1... Discriminator Loss: 1.0114... Generator Loss: 1.2520
Epoch 1/1... Discriminator Loss: 0.8008... Generator Loss: 1.3408
Epoch 1/1... Discriminator Loss: 0.4284... Generator Loss: 4.0730
Epoch 1/1... Discriminator Loss: 0.4527... Generator Loss: 4.4925
Epoch 1/1... Discriminator Loss: 1.5015... Generator Loss: 0.5845
Epoch 1/1... Discriminator Loss: 2.1795... Generator Loss: 0.3270
Epoch 1/1... Discriminator Loss: 1.9891... Generator Loss: 0.3219
Epoch 1/1... Discriminator Loss: 1.4483... Generator Loss: 3.4009
Epoch 1/1... Discriminator Loss: 0.7052... Generator Loss: 2.5069
Epoch 1/1... Discriminator Loss: 2.1022... Generator Loss: 0.2977
Epoch 1/1... Discriminator Loss: 2.3081... Generator Loss: 0.2532
Epoch 1/1... Discriminator Loss: 0.6036... Generator Loss: 2.4957
Epoch 1/1... Discriminator Loss: 0.9086... Generator Loss: 4.9687
Epoch 1/1... Discriminator Loss: 0.6196... Generator Loss: 2.0954
Epoch 1/1... Discriminator Loss: 0.7694... Generator Loss: 3.7294
Epoch 1/1... Discriminator Loss: 0.5278... Generator Loss: 2.8601
Epoch 1/1... Discriminator Loss: 0.6337... Generator Loss: 1.8999
Epoch 1/1... Discriminator Loss: 0.4785... Generator Loss: 6.5836
Epoch 1/1... Discriminator Loss: 0.7566... Generator Loss: 1.5844
Epoch 1/1... Discriminator Loss: 0.8381... Generator Loss: 1.5170
Epoch 1/1... Discriminator Loss: 1.5691... Generator Loss: 0.8709
Epoch 1/1... Discriminator Loss: 0.4140... Generator Loss: 3.8370
Epoch 1/1... Discriminator Loss: 3.3284... Generator Loss: 0.1288
Epoch 1/1... Discriminator Loss: 1.1256... Generator Loss: 3.7383
Epoch 1/1... Discriminator Loss: 0.4380... Generator Loss: 5.0228
Epoch 1/1... Discriminator Loss: 0.3840... Generator Loss: 4.7656
Epoch 1/1... Discriminator Loss: 0.5426... Generator Loss: 4.9375
Epoch 1/1... Discriminator Loss: 1.3368... Generator Loss: 0.6429
Epoch 1/1... Discriminator Loss: 0.4582... Generator Loss: 3.3509
Epoch 1/1... Discriminator Loss: 0.4586... Generator Loss: 5.0470
Epoch 1/1... Discriminator Loss: 1.2039... Generator Loss: 0.7203
Epoch 1/1... Discriminator Loss: 1.4094... Generator Loss: 0.5788
Epoch 1/1... Discriminator Loss: 0.7563... Generator Loss: 2.1473
Epoch 1/1... Discriminator Loss: 0.7359... Generator Loss: 1.5540
Epoch 1/1... Discriminator Loss: 0.7169... Generator Loss: 1.2692
Epoch 1/1... Discriminator Loss: 0.4900... Generator Loss: 2.3644
Epoch 1/1... Discriminator Loss: 0.8548... Generator Loss: 1.6155
Epoch 1/1... Discriminator Loss: 1.6893... Generator Loss: 0.4565
Epoch 1/1... Discriminator Loss: 1.4249... Generator Loss: 0.5486
Epoch 1/1... Discriminator Loss: 1.1378... Generator Loss: 0.6996
Epoch 1/1... Discriminator Loss: 0.7824... Generator Loss: 1.2819
Epoch 1/1... Discriminator Loss: 0.4517... Generator Loss: 3.3226
Epoch 1/1... Discriminator Loss: 0.5350... Generator Loss: 4.5733
Epoch 1/1... Discriminator Loss: 0.8519... Generator Loss: 1.2234
Epoch 1/1... Discriminator Loss: 0.9202... Generator Loss: 1.1870
Epoch 1/1... Discriminator Loss: 0.5233... Generator Loss: 2.2542
Epoch 1/1... Discriminator Loss: 0.4703... Generator Loss: 3.0370
Epoch 1/1... Discriminator Loss: 0.6519... Generator Loss: 1.8044
Epoch 1/1... Discriminator Loss: 0.6630... Generator Loss: 1.8176
Epoch 1/1... Discriminator Loss: 1.4280... Generator Loss: 0.8146
Epoch 1/1... Discriminator Loss: 0.5454... Generator Loss: 2.0997
Epoch 1/1... Discriminator Loss: 0.4029... Generator Loss: 4.9114
Epoch 1/1... Discriminator Loss: 0.7127... Generator Loss: 2.5260
Epoch 1/1... Discriminator Loss: 1.2283... Generator Loss: 1.3600
Epoch 1/1... Discriminator Loss: 1.2289... Generator Loss: 0.8097
Epoch 1/1... Discriminator Loss: 0.4384... Generator Loss: 2.9991
Epoch 1/1... Discriminator Loss: 0.4654... Generator Loss: 3.2048
Epoch 1/1... Discriminator Loss: 0.9822... Generator Loss: 1.2835
Epoch 1/1... Discriminator Loss: 0.4638... Generator Loss: 2.4240
Epoch 1/1... Discriminator Loss: 0.5461... Generator Loss: 2.4162
Epoch 1/1... Discriminator Loss: 0.4600... Generator Loss: 4.5232
Epoch 1/1... Discriminator Loss: 0.8469... Generator Loss: 1.3951
Epoch 1/1... Discriminator Loss: 1.3877... Generator Loss: 0.5952
Epoch 1/1... Discriminator Loss: 1.6159... Generator Loss: 0.5157
Epoch 1/1... Discriminator Loss: 0.6402... Generator Loss: 1.8498
Epoch 1/1... Discriminator Loss: 1.5105... Generator Loss: 0.4844
Epoch 1/1... Discriminator Loss: 0.5973... Generator Loss: 1.8584
Epoch 1/1... Discriminator Loss: 0.4275... Generator Loss: 4.0734
Epoch 1/1... Discriminator Loss: 2.1672... Generator Loss: 0.2831
Epoch 1/1... Discriminator Loss: 0.3735... Generator Loss: 4.5974
Epoch 1/1... Discriminator Loss: 0.4539... Generator Loss: 2.9522
Epoch 1/1... Discriminator Loss: 0.9004... Generator Loss: 3.2501
Epoch 1/1... Discriminator Loss: 1.6015... Generator Loss: 3.3878
Epoch 1/1... Discriminator Loss: 1.0201... Generator Loss: 2.3596
Epoch 1/1... Discriminator Loss: 0.5044... Generator Loss: 3.7844
Epoch 1/1... Discriminator Loss: 0.4383... Generator Loss: 2.9402
Epoch 1/1... Discriminator Loss: 1.0207... Generator Loss: 2.2380
Epoch 1/1... Discriminator Loss: 1.6871... Generator Loss: 0.4841
Epoch 1/1... Discriminator Loss: 0.8329... Generator Loss: 1.7717
Epoch 1/1... Discriminator Loss: 0.3746... Generator Loss: 4.5334
Epoch 1/1... Discriminator Loss: 0.5774... Generator Loss: 4.1303
Epoch 1/1... Discriminator Loss: 1.3398... Generator Loss: 3.2077
Epoch 1/1... Discriminator Loss: 0.4299... Generator Loss: 3.7866
Epoch 1/1... Discriminator Loss: 1.7881... Generator Loss: 0.4377
Epoch 1/1... Discriminator Loss: 0.8604... Generator Loss: 1.1741
Epoch 1/1... Discriminator Loss: 0.4695... Generator Loss: 3.7016
Epoch 1/1... Discriminator Loss: 1.5392... Generator Loss: 0.4957
Epoch 1/1... Discriminator Loss: 1.0594... Generator Loss: 0.8615
Epoch 1/1... Discriminator Loss: 1.2382... Generator Loss: 4.3030
Epoch 1/1... Discriminator Loss: 0.4306... Generator Loss: 3.1148
Epoch 1/1... Discriminator Loss: 0.9918... Generator Loss: 0.9467
Epoch 1/1... Discriminator Loss: 0.7691... Generator Loss: 1.4975
Epoch 1/1... Discriminator Loss: 1.3969... Generator Loss: 0.6941
Epoch 1/1... Discriminator Loss: 1.1189... Generator Loss: 0.9554
Training done!
CPU times: user 6min 16s, sys: 1min 11s, total: 7min 28s
Wall time: 9min 28s
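The only change between the two runs above is beta1, the decay rate of the first-moment (momentum) estimate in the Adam optimizer; values well below the default 0.9 are a common DCGAN choice because they let the optimizer react faster to the shifting adversarial losses. As a hedged sketch of how beta1 typically reaches the optimizers in a model_opt of this shape (the scope names and update-ops handling are assumptions, not necessarily the exact implementation used earlier in this notebook):

import tensorflow as tf

def model_opt(d_loss, g_loss, learning_rate, beta1):
    # Split trainable variables by the scope each sub-network was built under.
    t_vars = tf.trainable_variables()
    d_vars = [v for v in t_vars if v.name.startswith('discriminator')]
    g_vars = [v for v in t_vars if v.name.startswith('generator')]

    # Run batch-norm update ops before the optimizer steps.
    with tf.control_dependencies(tf.get_collection(tf.GraphKeys.UPDATE_OPS)):
        d_train_opt = tf.train.AdamOptimizer(learning_rate, beta1=beta1).minimize(d_loss, var_list=d_vars)
        g_train_opt = tf.train.AdamOptimizer(learning_rate, beta1=beta1).minimize(g_loss, var_list=g_vars)

    return d_train_opt, g_train_opt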
In [118]:
%%time
batch_size = 64
z_dim = 100
# original learning rate was 0.0005
learning_rate = 0.001
beta1 = 0.3
# normalize input images with batch_images * (2.0 / (np.max(batch_images) - np.min(batch_images)))
# generator kernel size changed to 3
# discriminator first layer kernel size changed to 3
# discriminator second layer kernel size changed to 3
# discriminator final layer kernel size changed to 3

"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
epochs = 1

celeba_dataset = helper.Dataset('celeba', glob(os.path.join(data_dir, 'img_align_celeba/*.jpg')))
with tf.Graph().as_default():
    train(epochs, batch_size, z_dim, learning_rate, beta1, celeba_dataset.get_batches,
          celeba_dataset.shape, celeba_dataset.image_mode)
Epoch 1/1... Discriminator Loss: 3.2268... Generator Loss: 0.2582
Epoch 1/1... Discriminator Loss: 0.6182... Generator Loss: 2.5168
Epoch 1/1... Discriminator Loss: 0.5189... Generator Loss: 2.7944
Epoch 1/1... Discriminator Loss: 0.6170... Generator Loss: 1.6856
Epoch 1/1... Discriminator Loss: 0.3726... Generator Loss: 6.3237
Epoch 1/1... Discriminator Loss: 0.4636... Generator Loss: 3.2241
Epoch 1/1... Discriminator Loss: 0.5708... Generator Loss: 2.4788
Epoch 1/1... Discriminator Loss: 0.4202... Generator Loss: 7.0118
Epoch 1/1... Discriminator Loss: 0.4654... Generator Loss: 3.1808
Epoch 1/1... Discriminator Loss: 0.3869... Generator Loss: 9.4753
Epoch 1/1... Discriminator Loss: 0.4839... Generator Loss: 3.2057
Epoch 1/1... Discriminator Loss: 0.4382... Generator Loss: 4.6917
Epoch 1/1... Discriminator Loss: 0.6185... Generator Loss: 2.2420
Epoch 1/1... Discriminator Loss: 1.0619... Generator Loss: 0.9934
Epoch 1/1... Discriminator Loss: 1.4561... Generator Loss: 10.3492
Epoch 1/1... Discriminator Loss: 0.3874... Generator Loss: 6.4337
Epoch 1/1... Discriminator Loss: 1.0117... Generator Loss: 5.1849
Epoch 1/1... Discriminator Loss: 0.6730... Generator Loss: 4.5258
Epoch 1/1... Discriminator Loss: 0.5426... Generator Loss: 2.7932
Epoch 1/1... Discriminator Loss: 1.4612... Generator Loss: 0.5216
Epoch 1/1... Discriminator Loss: 1.6110... Generator Loss: 0.4947
Epoch 1/1... Discriminator Loss: 1.9289... Generator Loss: 0.3716
Epoch 1/1... Discriminator Loss: 0.8012... Generator Loss: 1.3973
Epoch 1/1... Discriminator Loss: 1.0174... Generator Loss: 1.2285
Epoch 1/1... Discriminator Loss: 1.1604... Generator Loss: 0.8236
Epoch 1/1... Discriminator Loss: 1.5054... Generator Loss: 0.6136
Epoch 1/1... Discriminator Loss: 1.6523... Generator Loss: 0.4733
Epoch 1/1... Discriminator Loss: 0.4326... Generator Loss: 4.4824
Epoch 1/1... Discriminator Loss: 0.4866... Generator Loss: 2.6031
Epoch 1/1... Discriminator Loss: 0.5518... Generator Loss: 5.2136
Epoch 1/1... Discriminator Loss: 0.4701... Generator Loss: 3.3867
Epoch 1/1... Discriminator Loss: 1.1014... Generator Loss: 4.1296
Epoch 1/1... Discriminator Loss: 0.5850... Generator Loss: 3.3713
Epoch 1/1... Discriminator Loss: 1.1562... Generator Loss: 0.8134
Epoch 1/1... Discriminator Loss: 1.6846... Generator Loss: 0.4243
Epoch 1/1... Discriminator Loss: 0.5832... Generator Loss: 1.8991
Epoch 1/1... Discriminator Loss: 0.4916... Generator Loss: 2.9643
Epoch 1/1... Discriminator Loss: 1.1838... Generator Loss: 0.8856
Epoch 1/1... Discriminator Loss: 1.3338... Generator Loss: 0.8420
Epoch 1/1... Discriminator Loss: 1.1195... Generator Loss: 1.0567
Epoch 1/1... Discriminator Loss: 1.0975... Generator Loss: 1.0553
Epoch 1/1... Discriminator Loss: 0.4244... Generator Loss: 3.8814
Epoch 1/1... Discriminator Loss: 0.8414... Generator Loss: 4.7604
Epoch 1/1... Discriminator Loss: 0.5638... Generator Loss: 2.4182
Epoch 1/1... Discriminator Loss: 0.5272... Generator Loss: 2.3745
Epoch 1/1... Discriminator Loss: 0.5190... Generator Loss: 2.9802
Epoch 1/1... Discriminator Loss: 0.4277... Generator Loss: 3.5676
Epoch 1/1... Discriminator Loss: 0.5033... Generator Loss: 3.8148
Epoch 1/1... Discriminator Loss: 0.4342... Generator Loss: 3.6383
Epoch 1/1... Discriminator Loss: 0.5534... Generator Loss: 2.8880
Epoch 1/1... Discriminator Loss: 0.6534... Generator Loss: 2.6082
Epoch 1/1... Discriminator Loss: 0.5323... Generator Loss: 2.7912
Epoch 1/1... Discriminator Loss: 0.7394... Generator Loss: 1.4110
Epoch 1/1... Discriminator Loss: 0.5415... Generator Loss: 2.5954
Epoch 1/1... Discriminator Loss: 0.9655... Generator Loss: 3.4504
Epoch 1/1... Discriminator Loss: 0.5636... Generator Loss: 2.8813
Epoch 1/1... Discriminator Loss: 0.6979... Generator Loss: 1.5440
Epoch 1/1... Discriminator Loss: 2.7468... Generator Loss: 0.1405
Epoch 1/1... Discriminator Loss: 1.2424... Generator Loss: 0.6037
Epoch 1/1... Discriminator Loss: 0.7735... Generator Loss: 1.5663
Epoch 1/1... Discriminator Loss: 0.6191... Generator Loss: 1.8840
Epoch 1/1... Discriminator Loss: 1.0037... Generator Loss: 1.0633
Epoch 1/1... Discriminator Loss: 1.1910... Generator Loss: 0.7292
Epoch 1/1... Discriminator Loss: 0.6333... Generator Loss: 1.9704
Epoch 1/1... Discriminator Loss: 0.8853... Generator Loss: 1.2550
Epoch 1/1... Discriminator Loss: 1.5610... Generator Loss: 0.4673
Epoch 1/1... Discriminator Loss: 0.9837... Generator Loss: 1.0153
Epoch 1/1... Discriminator Loss: 1.6968... Generator Loss: 2.6555
Epoch 1/1... Discriminator Loss: 0.9048... Generator Loss: 1.0305
Epoch 1/1... Discriminator Loss: 1.8896... Generator Loss: 0.3916
Epoch 1/1... Discriminator Loss: 0.7560... Generator Loss: 1.5360
Epoch 1/1... Discriminator Loss: 0.9847... Generator Loss: 0.9005
Epoch 1/1... Discriminator Loss: 0.6448... Generator Loss: 2.2426
Epoch 1/1... Discriminator Loss: 0.4175... Generator Loss: 3.3422
Epoch 1/1... Discriminator Loss: 0.4696... Generator Loss: 3.3801
Epoch 1/1... Discriminator Loss: 0.7073... Generator Loss: 2.7922
Epoch 1/1... Discriminator Loss: 0.6380... Generator Loss: 1.9433
Epoch 1/1... Discriminator Loss: 0.9324... Generator Loss: 0.9715
Epoch 1/1... Discriminator Loss: 0.5703... Generator Loss: 1.9790
Epoch 1/1... Discriminator Loss: 0.7245... Generator Loss: 1.5305
Epoch 1/1... Discriminator Loss: 0.4486... Generator Loss: 4.2486
Epoch 1/1... Discriminator Loss: 0.7335... Generator Loss: 2.0541
Epoch 1/1... Discriminator Loss: 0.6059... Generator Loss: 2.1051
Epoch 1/1... Discriminator Loss: 0.4189... Generator Loss: 3.4016
Epoch 1/1... Discriminator Loss: 0.9247... Generator Loss: 1.4229
Epoch 1/1... Discriminator Loss: 0.5101... Generator Loss: 2.3809
Epoch 1/1... Discriminator Loss: 1.6972... Generator Loss: 0.6454
Epoch 1/1... Discriminator Loss: 0.5220... Generator Loss: 2.4094
Epoch 1/1... Discriminator Loss: 0.4131... Generator Loss: 3.4904
Epoch 1/1... Discriminator Loss: 0.5362... Generator Loss: 2.5455
Epoch 1/1... Discriminator Loss: 0.7431... Generator Loss: 1.4402
Epoch 1/1... Discriminator Loss: 1.8169... Generator Loss: 0.4806
Epoch 1/1... Discriminator Loss: 0.6543... Generator Loss: 1.6829
Epoch 1/1... Discriminator Loss: 1.3590... Generator Loss: 2.9818
Epoch 1/1... Discriminator Loss: 0.5025... Generator Loss: 2.4853
Epoch 1/1... Discriminator Loss: 1.7481... Generator Loss: 0.4964
Epoch 1/1... Discriminator Loss: 0.5268... Generator Loss: 2.1870
Epoch 1/1... Discriminator Loss: 1.2996... Generator Loss: 0.7593
Epoch 1/1... Discriminator Loss: 0.7041... Generator Loss: 1.6734
Epoch 1/1... Discriminator Loss: 0.4870... Generator Loss: 2.4490
Epoch 1/1... Discriminator Loss: 0.6232... Generator Loss: 2.1987
Epoch 1/1... Discriminator Loss: 1.6342... Generator Loss: 0.4910
Epoch 1/1... Discriminator Loss: 0.4872... Generator Loss: 2.4268
Training done!
CPU times: user 5min 41s, sys: 59.2 s, total: 6min 41s
Wall time: 7min 59s
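The previous run and the one below additionally shrink the discriminator's final-layer kernel to 3, as noted in the comments. The kernel size is simply an argument to the convolution layers, so the change amounts to passing kernel_size=3. A minimal illustrative sketch of one strided discriminator block with a 3x3 kernel; the filter count and leaky-ReLU slope here are assumptions, not the exact values used in this notebook:

import tensorflow as tf

# Real-image input, shaped like the 28x28 RGB CelebA batches used elsewhere.
images = tf.placeholder(tf.float32, shape=(None, 28, 28, 3))

# One strided convolution using the smaller 3x3 kernel.
x1 = tf.layers.conv2d(images, filters=64, kernel_size=3, strides=2, padding='same')
x1 = tf.maximum(0.2 * x1, x1)  # leaky ReLU activation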
In [119]:
%%time
batch_size = 64
z_dim = 100
# original learning rate was 0.0005
learning_rate = 0.001
beta1 = 0.4
# normalize input images with batch_images * (2.0 / (np.max(batch_images) - np.min(batch_images)))
# generator kernel size changed to 3
# discriminator first layer kernel size changed to 3
# discriminator second layer kernel size changed to 3
# discriminator final layer kernel size changed to 3

"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
epochs = 1

celeba_dataset = helper.Dataset('celeba', glob(os.path.join(data_dir, 'img_align_celeba/*.jpg')))
with tf.Graph().as_default():
    train(epochs, batch_size, z_dim, learning_rate, beta1, celeba_dataset.get_batches,
          celeba_dataset.shape, celeba_dataset.image_mode)
Epoch 1/1... Discriminator Loss: 3.5136... Generator Loss: 8.5369
Epoch 1/1... Discriminator Loss: 2.9139... Generator Loss: 5.7074
Epoch 1/1... Discriminator Loss: 0.9755... Generator Loss: 2.0297
Epoch 1/1... Discriminator Loss: 2.1624... Generator Loss: 4.8693
Epoch 1/1... Discriminator Loss: 2.3664... Generator Loss: 6.4357
Epoch 1/1... Discriminator Loss: 0.5970... Generator Loss: 3.1111
Epoch 1/1... Discriminator Loss: 1.9661... Generator Loss: 0.3266
Epoch 1/1... Discriminator Loss: 1.3907... Generator Loss: 8.4685
Epoch 1/1... Discriminator Loss: 0.7932... Generator Loss: 1.5844
Epoch 1/1... Discriminator Loss: 0.6402... Generator Loss: 2.3365
Epoch 1/1... Discriminator Loss: 0.6296... Generator Loss: 2.0496
Epoch 1/1... Discriminator Loss: 0.5019... Generator Loss: 3.4779
Epoch 1/1... Discriminator Loss: 0.3894... Generator Loss: 5.0169
Epoch 1/1... Discriminator Loss: 0.6119... Generator Loss: 2.1243
Epoch 1/1... Discriminator Loss: 1.0386... Generator Loss: 0.9951
Epoch 1/1... Discriminator Loss: 0.4201... Generator Loss: 6.3752
Epoch 1/1... Discriminator Loss: 3.2278... Generator Loss: 7.3274
Epoch 1/1... Discriminator Loss: 0.4047... Generator Loss: 4.3179
Epoch 1/1... Discriminator Loss: 0.8676... Generator Loss: 4.0206
Epoch 1/1... Discriminator Loss: 0.5684... Generator Loss: 2.3926
Epoch 1/1... Discriminator Loss: 0.7324... Generator Loss: 1.7538
Epoch 1/1... Discriminator Loss: 1.1272... Generator Loss: 1.1374
Epoch 1/1... Discriminator Loss: 0.5812... Generator Loss: 2.2146
Epoch 1/1... Discriminator Loss: 0.4966... Generator Loss: 3.6057
Epoch 1/1... Discriminator Loss: 1.6030... Generator Loss: 0.5155
Epoch 1/1... Discriminator Loss: 0.5419... Generator Loss: 2.9075
Epoch 1/1... Discriminator Loss: 0.8311... Generator Loss: 1.6582
Epoch 1/1... Discriminator Loss: 0.4039... Generator Loss: 4.4393
Epoch 1/1... Discriminator Loss: 0.4978... Generator Loss: 3.3344
Epoch 1/1... Discriminator Loss: 2.0164... Generator Loss: 0.3586
Epoch 1/1... Discriminator Loss: 1.5022... Generator Loss: 0.6034
Epoch 1/1... Discriminator Loss: 0.3775... Generator Loss: 5.5861
Epoch 1/1... Discriminator Loss: 1.7602... Generator Loss: 4.2504
Epoch 1/1... Discriminator Loss: 0.5081... Generator Loss: 2.6719
Epoch 1/1... Discriminator Loss: 0.7207... Generator Loss: 1.3318
Epoch 1/1... Discriminator Loss: 0.6838... Generator Loss: 2.0445
Epoch 1/1... Discriminator Loss: 0.4976... Generator Loss: 3.6515
Epoch 1/1... Discriminator Loss: 0.6537... Generator Loss: 1.8171
Epoch 1/1... Discriminator Loss: 0.5982... Generator Loss: 2.3358
Epoch 1/1... Discriminator Loss: 0.4743... Generator Loss: 3.0068
Epoch 1/1... Discriminator Loss: 0.5450... Generator Loss: 2.4659
Epoch 1/1... Discriminator Loss: 0.5547... Generator Loss: 2.8157
Epoch 1/1... Discriminator Loss: 0.3830... Generator Loss: 4.6525
Epoch 1/1... Discriminator Loss: 0.6813... Generator Loss: 2.5812
Epoch 1/1... Discriminator Loss: 0.6745... Generator Loss: 1.7339
Epoch 1/1... Discriminator Loss: 0.4381... Generator Loss: 3.5695
Epoch 1/1... Discriminator Loss: 0.4207... Generator Loss: 3.4412
Epoch 1/1... Discriminator Loss: 1.1351... Generator Loss: 1.0946
Epoch 1/1... Discriminator Loss: 0.5206... Generator Loss: 2.2564
Epoch 1/1... Discriminator Loss: 0.6667... Generator Loss: 4.2015
Epoch 1/1... Discriminator Loss: 1.2536... Generator Loss: 0.6521
Epoch 1/1... Discriminator Loss: 0.8406... Generator Loss: 1.4123
Epoch 1/1... Discriminator Loss: 0.4840... Generator Loss: 3.8890
Epoch 1/1... Discriminator Loss: 0.5255... Generator Loss: 2.9220
Epoch 1/1... Discriminator Loss: 0.4636... Generator Loss: 3.2381
Epoch 1/1... Discriminator Loss: 0.4086... Generator Loss: 3.7285
Epoch 1/1... Discriminator Loss: 1.6291... Generator Loss: 0.4325
Epoch 1/1... Discriminator Loss: 0.8999... Generator Loss: 1.7778
Epoch 1/1... Discriminator Loss: 2.3420... Generator Loss: 0.2453
Epoch 1/1... Discriminator Loss: 0.5894... Generator Loss: 2.0315
Epoch 1/1... Discriminator Loss: 0.8297... Generator Loss: 1.6452
Epoch 1/1... Discriminator Loss: 0.4247... Generator Loss: 4.2011
Epoch 1/1... Discriminator Loss: 1.1168... Generator Loss: 0.8509
Epoch 1/1... Discriminator Loss: 0.6784... Generator Loss: 1.8604
Epoch 1/1... Discriminator Loss: 0.5956... Generator Loss: 2.0016
Epoch 1/1... Discriminator Loss: 0.8313... Generator Loss: 1.3043
Epoch 1/1... Discriminator Loss: 0.9586... Generator Loss: 1.7007
Epoch 1/1... Discriminator Loss: 1.8787... Generator Loss: 4.0909
Epoch 1/1... Discriminator Loss: 0.7050... Generator Loss: 1.7165
Epoch 1/1... Discriminator Loss: 0.5668... Generator Loss: 2.4861
Epoch 1/1... Discriminator Loss: 0.4812... Generator Loss: 2.5645
Epoch 1/1... Discriminator Loss: 1.0849... Generator Loss: 0.9116
Epoch 1/1... Discriminator Loss: 0.5694... Generator Loss: 2.0831
Epoch 1/1... Discriminator Loss: 0.9366... Generator Loss: 1.1113
Epoch 1/1... Discriminator Loss: 0.9963... Generator Loss: 1.0050
Epoch 1/1... Discriminator Loss: 1.0734... Generator Loss: 1.5390
Epoch 1/1... Discriminator Loss: 0.5349... Generator Loss: 2.7027
Epoch 1/1... Discriminator Loss: 0.5523... Generator Loss: 2.1967
Epoch 1/1... Discriminator Loss: 1.2611... Generator Loss: 0.8293
Epoch 1/1... Discriminator Loss: 0.4331... Generator Loss: 3.4545
Epoch 1/1... Discriminator Loss: 2.0133... Generator Loss: 4.5285
Epoch 1/1... Discriminator Loss: 0.6596... Generator Loss: 2.8816
Epoch 1/1... Discriminator Loss: 2.0480... Generator Loss: 0.4574
Epoch 1/1... Discriminator Loss: 0.6902... Generator Loss: 1.5840
Epoch 1/1... Discriminator Loss: 0.8452... Generator Loss: 1.1941
Epoch 1/1... Discriminator Loss: 0.4265... Generator Loss: 4.5410
Epoch 1/1... Discriminator Loss: 0.7190... Generator Loss: 1.9257
Epoch 1/1... Discriminator Loss: 0.4574... Generator Loss: 2.7635
Epoch 1/1... Discriminator Loss: 0.4772... Generator Loss: 3.7434
Epoch 1/1... Discriminator Loss: 0.4943... Generator Loss: 2.8093
Epoch 1/1... Discriminator Loss: 1.2780... Generator Loss: 0.9327
Epoch 1/1... Discriminator Loss: 1.0376... Generator Loss: 0.9133
Epoch 1/1... Discriminator Loss: 0.7069... Generator Loss: 2.1264
Epoch 1/1... Discriminator Loss: 0.7647... Generator Loss: 1.4453
Epoch 1/1... Discriminator Loss: 0.5326... Generator Loss: 2.2062
Epoch 1/1... Discriminator Loss: 1.0542... Generator Loss: 1.9405
Epoch 1/1... Discriminator Loss: 1.2700... Generator Loss: 0.8031
Epoch 1/1... Discriminator Loss: 0.4904... Generator Loss: 3.3379
Epoch 1/1... Discriminator Loss: 2.7371... Generator Loss: 4.0856
Epoch 1/1... Discriminator Loss: 0.6112... Generator Loss: 3.8767
Epoch 1/1... Discriminator Loss: 0.8114... Generator Loss: 1.9304
Epoch 1/1... Discriminator Loss: 0.6480... Generator Loss: 2.2139
Epoch 1/1... Discriminator Loss: 0.5284... Generator Loss: 2.4906
Training done!
CPU times: user 5min 39s, sys: 57.8 s, total: 6min 37s
Wall time: 8min

Submitting This Project

When submitting this project, make sure to run all the cells before saving the notebook. Save the notebook file as "dlnd_face_generation.ipynb" and export it as an HTML file under "File" -> "Download as". Include the "helper.py" and "problem_unittests.py" files in your submission.